Xiaobo Liu | 2019-2024 | SUSTech Ph.D. Program
- Undergraduate: Beijing Institute of Technology
- Master's: University of Chinese Academy of Sciences
- Research Area: Machine Vision, Robotic Grasping
- Supervisor: Dr. Chaoyang Song
- Doctoral Thesis Title: Proprioceptive Sensing of Soft Polyhedral Networks and Robot Manipulation Learning Based on Visual Methods
- Doctoral Thesis Committee:
- Dr. Zaiyue Yang, Professor, School of System Design and Intelligent Manufacturing, Southern University of Science and Technology
- Dr. Ping Wang, Associate Professor, School of Intelligent Systems Engineering, Sun Yat-Sen University
- Dr. Tin Lun Lam, Assistant Professor, Robotics & Intelligent Systems, The Chinese University of Hong Kong, Shenzhen
- Dr. Wei Zhang, Professor, School of System Design and Intelligent Manufacturing, Southern University of Science and Technology
- Dr. Hui Deng, Associate Professor, Department of Mechanical and Energy Engineering, Southern University of Science and Technology
- Selected Publications:
- Xiaobo Liu#, Xudong Han#, …, Chaoyang Song* (2024). “Proprioceptive Learning with Soft Polyhedral Networks.” The International Journal of Robotics Research (OnlineFirst).
- Xiaobo Liu, …, Chaoyang Song* (2023). “Bio-inspired Proprioceptive Touch of a Soft Finger with Inner-Finger Kinesthetic Perception.” Biomimetics, 8(6):501.
- Tianyu Wu#, Yujian Dong#, Xiaobo Liu#, …, Chaoyang Song* (2024). “Vision-based Tactile Intelligence with Soft Robotic Metamaterial.” Materials & Design, 238(February):112629.
- Haokun Wang#, Xiaobo Liu#, …, Chaoyang Song* (2022). “DeepClaw 2.0: A Data Collection Platform for Learning Human Manipulation.” Frontiers in Robotics and AI, 9:787291.
Abstract
Object manipulation is the most fundamental ability robots need to interact with objects. The robot gripper, as the primary physical interface with objects, directly affects the robot's flexibility and efficiency across application scenarios. Compared with traditional rigid grippers, soft grippers offer higher flexibility, environmental adaptability, and human-robot interaction capability. Soft grippers can grasp objects of different shapes stably and handle fragile objects without damage, so they are widely used in robot grasping and manipulation. However, their underactuation makes the control and perception of soft fingers more complex and challenging. This thesis focuses on proprioception and object perception with soft fingers, proposing a design method for a class of omni-directional adaptive soft fingers called Soft Polyhedral Networks. It addresses viscoelasticity, proprioception, robotic manipulation benchmarking, and in-hand object pose estimation. By integrating ArUco markers and cameras into the fingers, the thesis studies proprioception through finger deformation. The soft finger is modeled and simulated in Abaqus, and simulated deformation data are collected. An MLP network is trained on these data to reconstruct finger deformation and is then transferred to real physical interactions, demonstrating Sim2Real capability. The static and dynamic viscoelastic characteristics are further analyzed; compared with existing soft sensors, the model that accounts for viscoelasticity achieves higher force-prediction accuracy. To address the uncertainty of object states during manipulation, the thesis proposes a method for estimating the pose and category of in-hand objects based on soft fingers and in-finger vision. The method is applicable to different types of grippers and tactile sensors and was tested on a two-finger gripper, achieving high classification and localization accuracy. To address data generation and standardization in learning manipulation tasks, a benchmark experimental system for robot manipulation is proposed, and a shareable, reproducible data collection platform called DeepClaw is built. DeepClaw enables low-cost collection of manipulation trajectories and interaction forces, and a manipulation task dataset is created with it. In object manipulation, observing the deformation of the soft fingers endows them with the ability to sense the contact state, enabling real-time sensing of contact force and deformation and, in turn, real-time estimation of the object state. Finally, manipulation experiments validate the effectiveness of this sensing technology, providing a technical foundation for dexterous object manipulation.
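The sensing pipeline summarized above combines in-finger vision (an ArUco marker tracked by a camera mounted inside the soft finger) with a learned mapping from marker pose to finger deformation and contact force. The sketch below only illustrates that idea; it is not the thesis code, and the marker dictionary, camera intrinsics, layer sizes, and output dimension are placeholder assumptions (it uses OpenCV's classic cv2.aruco API and PyTorch).

```python
"""Minimal sketch (not the thesis code): track an ArUco marker inside a
Soft Polyhedral Network finger with an in-finger camera, then map the
marker's 6-DoF pose to a deformation/force estimate with a small MLP.
Marker size, intrinsics, and network sizes are illustrative assumptions.
Uses the classic cv2.aruco.detectMarkers API (OpenCV <= 4.6 / contrib);
OpenCV >= 4.7 offers cv2.aruco.ArucoDetector instead."""
import cv2
import numpy as np
import torch
import torch.nn as nn

MARKER_SIZE = 0.012  # marker edge length in metres (assumed)
CAMERA_MATRIX = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)

# 3D marker corners in the marker frame, matching detectMarkers' corner order
OBJ_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)


def marker_pose(frame_gray):
    """Detect one ArUco marker and return its 6-DoF pose as [rvec, tvec], or None."""
    corners, ids, _ = cv2.aruco.detectMarkers(frame_gray, ARUCO_DICT)
    if ids is None or len(corners) == 0:
        return None
    img_points = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(OBJ_POINTS, img_points,
                                  CAMERA_MATRIX, DIST_COEFFS)
    if not ok:
        return None
    return np.concatenate([rvec.ravel(), tvec.ravel()])  # 6-vector


class DeformationMLP(nn.Module):
    """Small MLP mapping marker pose -> finger deformation / contact force.
    Layer sizes and output dimension are assumptions, not thesis values."""
    def __init__(self, out_dim=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, out_dim),
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    model = DeformationMLP()
    cap = cv2.VideoCapture(0)  # in-finger camera index (assumed)
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pose = marker_pose(gray)
        if pose is not None:
            with torch.no_grad():
                deform = model(torch.tensor(pose, dtype=torch.float32))
            print("estimated deformation/force vector:", deform.numpy())
    cap.release()
```

In the thesis' setting, such a network would be trained on Abaqus simulation data before being transferred to the physical finger (Sim2Real); here the untrained model only shows the data flow from marker pose to deformation estimate.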
BibTeX
@phdthesis{Liu2024PhDThesis,
  title     = {Proprioceptive Sensing of Soft Polyhedral Networks and Robot Manipulation Learning Based on Visual Methods},
  author    = {Xiaobo Liu},
  year      = {2024},
  date      = {2024-08-29},
  urldate   = {2024-08-29},
  school    = {Southern University of Science and Technology},
  abstract  = {Object manipulation is the most fundamental ability robots need to interact with objects. The robot gripper, as the primary physical interface with objects, directly affects the robot's flexibility and efficiency across application scenarios. Compared with traditional rigid grippers, soft grippers offer higher flexibility, environmental adaptability, and human-robot interaction capability. Soft grippers can grasp objects of different shapes stably and handle fragile objects without damage, so they are widely used in robot grasping and manipulation. However, their underactuation makes the control and perception of soft fingers more complex and challenging. This thesis focuses on proprioception and object perception with soft fingers, proposing a design method for a class of omni-directional adaptive soft fingers called Soft Polyhedral Networks. It addresses viscoelasticity, proprioception, robotic manipulation benchmarking, and in-hand object pose estimation. By integrating ArUco markers and cameras into the fingers, the thesis studies proprioception through finger deformation. The soft finger is modeled and simulated in Abaqus, and simulated deformation data are collected. An MLP network is trained on these data to reconstruct finger deformation and is then transferred to real physical interactions, demonstrating Sim2Real capability. The static and dynamic viscoelastic characteristics are further analyzed; compared with existing soft sensors, the model that accounts for viscoelasticity achieves higher force-prediction accuracy. To address the uncertainty of object states during manipulation, the thesis proposes a method for estimating the pose and category of in-hand objects based on soft fingers and in-finger vision. The method is applicable to different types of grippers and tactile sensors and was tested on a two-finger gripper, achieving high classification and localization accuracy. To address data generation and standardization in learning manipulation tasks, a benchmark experimental system for robot manipulation is proposed, and a shareable, reproducible data collection platform called DeepClaw is built. DeepClaw enables low-cost collection of manipulation trajectories and interaction forces, and a manipulation task dataset is created with it. In object manipulation, observing the deformation of the soft fingers endows them with the ability to sense the contact state, enabling real-time sensing of contact force and deformation and, in turn, real-time estimation of the object state. Finally, manipulation experiments validate the effectiveness of this sensing technology, providing a technical foundation for dexterous object manipulation.},
  keywords  = {Doctoral, Supervisor, SUSTech PhD Program, Thesis},
  pubstate  = {published},
  tppubtype = {phdthesis}
}