Linhan Yang | 2019-2024 | SUSTech-HKU Joint Ph.D. Program
- Undergraduate: Tsinghua University
- Research Area: Soft Manipulation, Robot Learning, Graph Neural Networks
- Supervisor from the Southern University of Science and Technology: Dr. Chaoyang Song
- Supervisor from the University of Hong Kong: Dr. Jia Pan
- Co-Supervisor from the University of Hong Kong: Dr. Wenping Wang
- Doctoral Thesis Title: Rigid-Soft Interactive Learning for Robotic Manipulation
- Doctoral Thesis Committee:
- Dr. Zhiwen Zhang, Associate Professor, Department of Mathematics, The University of Hong Kong
- Dr. Hengshuang Zhao, Assistant Professor, Department of Computer Science, The University of Hong Kong
- Dr. Zaiyue Yang, Professor, School of System Design and Intelligent Manufacturing, Southern University of Science and Technology
- Dr. Ping Wang, Associate Professor, School of Intelligent Systems Engineering, Sun Yat-Sen University
- Selected Publications:
- Linhan Yang, …, Chaoyang Song*, Jia Pan* (2024). “One Fling to Goal: Environment-aware Dynamics for Goal-conditioned Fabric Flinging.” Workshop on the Algorithmic Foundations of Robotics (WAFR2024), Chicago, USA. (Accepted)
- Haoran Sun, Linhan Yang, …, Chaoyang Song*, Jia Pan* (2024). “CopGNN: Learning End-to-End Cloth Coverage Prediction via Graph Neural Networks.” IROS 2024 Workshop on Benchmarking via Competitions in Robotic Grasping and Manipulation, Abu Dhabi, UAE. (Accepted)
- Linhan Yang, …, Jia Pan, Chaoyang Song* (2021). “Learning-based Optoelectronically Innervated Tactile Finger for Rigid-Soft Interactive Grasping.” IEEE Robotics and Automation Letters, 6(2):3817–3824.
- Linhan Yang, …, Jia Pan, Chaoyang Song* (2020). “Rigid-Soft Interactive Learning for Robust Grasping.” IEEE Robotics and Automation Letters, 5(2):1720–1727.
Abstract
Recent years have witnessed significant advances in robotic manipulation through the adoption of machine learning methods. Unlike domains such as computer vision and natural language processing, robotic manipulation involves complex physical interactions that pose substantial challenges for developing scalable and generalizable control policies. In this thesis, we explore the understanding and representation learning of these interactions across various robotic manipulation scenarios. We classify these interactions into two categories: internal interactions, between the manipulator (gripper or robot) and the objects, and external interactions, involving the objects/robots and their external environments.

Focusing on internal interactions, we first investigate a grasp prediction task. We vary factors such as gripper stiffness (rigid or soft fingers) and grasp type (power or precision), which implicitly encodes interaction data in our dataset. Our experiments show that this configuration greatly improves both training speed and grasping performance. These interactions can also be represented explicitly as force and torque data by equipping the finger surfaces with multi-channel optical fibers; on this basis, we develop an interactive grasp policy that utilizes local interaction data. The proprioceptive capability of the fingers enables them to conform to object contact regions, ensuring a stable grasp. We then extend our research to dexterous in-hand manipulation, specifically rotating two spheres within the hand by 180 degrees. During this task, interactions between the objects and the hand are continuously broken and reformed. We use a four-fingered hand equipped with a tactile sensor array to gather comprehensive interaction data, and to represent these data effectively we introduce TacGNN, a generalized model for tactile information across various shapes. This model allows us to achieve in-hand manipulation using proprioceptive tactile sensing alone.

In our exploration of external interactions between objects/robots and their environments, we begin with rigid-rigid interaction in a loco-manipulation problem. We merge interaction data from both locomotion and manipulation into a unified graph-based framework, and a shared control policy developed in simulation transfers directly to real-world applications in a zero-shot manner. Finally, we investigate rigid-soft interactions through a fabric manipulation task involving deformable objects. We develop a graph-based, environment-aware representation for fabric that integrates environmental data; this representation encodes interaction data so that each fabric segment can detect and respond to environmental contact. Employing this strategy, we successfully execute a goal-conditioned manipulation task: placing the fabric in a specified configuration within complex scenarios on the first attempt.
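As an illustrative aside (not code from the thesis), the sketch below shows the kind of graph-based tactile representation a TacGNN-style model builds: taxels become graph nodes, spatial adjacency becomes edges, and a few rounds of message passing pool local contact readings into a global feature that a control policy could consume. The function names, array sizes, and the mean-aggregation update rule are all assumptions made for this example.

```python
# Minimal sketch of graph-based tactile feature extraction,
# in the spirit of (but not identical to) TacGNN.
import numpy as np

def build_taxel_graph(positions, radius):
    """Connect taxels whose centers lie within `radius` of each other."""
    n = len(positions)
    adj = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j and np.linalg.norm(positions[i] - positions[j]) < radius:
                adj[i, j] = 1.0
    return adj

def message_passing(readings, adj, w_in, w_hidden, steps=2):
    """Mean-neighbor message passing with shared linear maps and ReLU."""
    h = np.maximum(readings @ w_in, 0.0)           # embed raw taxel readings
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    for _ in range(steps):
        msg = (adj @ h) / deg                      # average neighbor features
        h = np.maximum((h + msg) @ w_hidden, 0.0)  # node update
    return h.mean(axis=0)                          # graph-level readout

# Toy example: a 4x4 taxel array with random pressure readings.
rng = np.random.default_rng(0)
positions = np.array([[x, y] for x in range(4) for y in range(4)], dtype=float)
pressures = rng.random((16, 1))                    # one pressure value per taxel
adj = build_taxel_graph(positions, radius=1.5)
w_in = rng.standard_normal((1, 8)) * 0.1
w_hidden = rng.standard_normal((8, 8)) * 0.1
embedding = message_passing(pressures, adj, w_in, w_hidden)
print(embedding.shape)                             # (8,) global tactile feature
```

Because the same message-passing update is shared across all nodes, a model of this shape is indifferent to the number and layout of taxels, which is what lets a graph representation generalize across sensor arrays and object shapes; the same idea extends to the fabric case, where mesh segments (rather than taxels) become nodes and environment contact enters as extra node features.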
BibTeX
@phdthesis{Yang2024PhDThesis,
  title     = {Rigid-Soft Interactive Learning for Robotic Manipulation},
  author    = {Linhan Yang},
  year      = {2024},
  date      = {2024-07-30},
  urldate   = {2024-07-30},
  school    = {Southern University of Science and Technology \& The University of Hong Kong},
  abstract  = {Recent years have witnessed significant advances in robotic manipulation through the adoption of machine learning methods. Unlike domains such as computer vision and natural language processing, robotic manipulation involves complex physical interactions that pose substantial challenges for developing scalable and generalizable control policies. In this thesis, we explore the understanding and representation learning of these interactions across various robotic manipulation scenarios. We classify these interactions into two categories: internal interactions, between the manipulator (gripper or robot) and the objects, and external interactions, involving the objects/robots and their external environments. Focusing on internal interactions, we first investigate a grasp prediction task. We vary factors such as gripper stiffness (rigid or soft fingers) and grasp type (power or precision), which implicitly encodes interaction data in our dataset. Our experiments show that this configuration greatly improves both training speed and grasping performance. These interactions can also be represented explicitly as force and torque data by equipping the finger surfaces with multi-channel optical fibers; on this basis, we develop an interactive grasp policy that utilizes local interaction data. The proprioceptive capability of the fingers enables them to conform to object contact regions, ensuring a stable grasp. We then extend our research to dexterous in-hand manipulation, specifically rotating two spheres within the hand by 180 degrees. During this task, interactions between the objects and the hand are continuously broken and reformed. We use a four-fingered hand equipped with a tactile sensor array to gather comprehensive interaction data, and to represent these data effectively we introduce TacGNN, a generalized model for tactile information across various shapes. This model allows us to achieve in-hand manipulation using proprioceptive tactile sensing alone. In our exploration of external interactions between objects/robots and their environments, we begin with rigid-rigid interaction in a loco-manipulation problem. We merge interaction data from both locomotion and manipulation into a unified graph-based framework, and a shared control policy developed in simulation transfers directly to real-world applications in a zero-shot manner. Finally, we investigate rigid-soft interactions through a fabric manipulation task involving deformable objects. We develop a graph-based, environment-aware representation for fabric that integrates environmental data; this representation encodes interaction data so that each fabric segment can detect and respond to environmental contact. Employing this strategy, we successfully execute a goal-conditioned manipulation task: placing the fabric in a specified configuration within complex scenarios on the first attempt.},
  keywords  = {Doctoral, Supervisor, SUSTech-HKU Joint PhD Program, Thesis},
  pubstate  = {published},
  tppubtype = {phdthesis}
}