Ning Guo|2019-2024|SUSTech Ph.D. Program
- Undergraduate: Harbin Institute of Technology
- Master: Harbin Institute of Technology
- Research Area: Soft Robot Control, Robot Learning
- Supervisor: Dr. Chaoyang Song
- Doctoral Thesis Title: Soft Robotic Perception Mechanism via Vision-Based Tactile Reconstruction and its Amphibious Applications in Dexterous Manipulation
- Doctoral Thesis Committee:
- Dr. Zaiyue Yang, Professor, School of System Design and Intelligent Manufacturing, Southern University of Science and Technology
- Dr. Ping Wang, Associate Professor, School of Intelligent Systems Engineering, Sun Yat-Sen University
- Dr. Tin Lun Lam, Assistant Professor, Robotics & Intelligent Systems, The Chinese University of Hong Kong, Shenzhen
- Dr. Wei Zhang, Professor, School of System Design and Intelligent Manufacturing, Southern University of Science and Technology
- Dr. Hui Deng, Associate Professor, Department of Mechanical and Energy Engineering, Southern University of Science and Technology
- Selected Publications:
- Ning Guo#, Xudong Han#, Shuqiao Zhong#, …, Chaoyang Song* (2024). “Proprioceptive State Estimation for Amphibious Tactile Sensing.” IEEE Transactions on Robotics (Early Access).
- https://doi.org/10.1109/TRO.2024.3463509
- Special Issue on Tactile Robotics
- Ning Guo, …, Chaoyang Song* (2024). “Reconstructing Soft Robotic Touch via In-Finger Vision.” Advanced Intelligent Systems (OnlineFirst): 2400022.
- https://doi.org/10.1002/aisy.202400022
- Selected as the Front Cover for the October 2024 Issue.
- Ning Guo, …, Chaoyang Song* (2024). “Autoencoding a Soft Touch to Learn Grasping from On-land to Underwater.” Advanced Intelligent Systems, 6(1):2300382.
- https://doi.org/10.1002/aisy.202300382
- Selected as the Front Cover for the January 2024 Issue.
- Xudong Han, Ning Guo, …, Chaoyang Song* (2024). “On Flange-Based 3D Hand-Eye Calibration for Soft Robotic Tactile Welding.” Measurement, 238(October): 115376.
Abstract
In recent years, with the rapid advancement of robotics, dexterous manipulation in complex environments has become a focal point of research. In amphibious environments in particular, robots must possess a high level of perception and adaptability to cope with varying operating conditions. Against this backdrop, vision-based tactile technology, which integrates visual and tactile information, has shown great potential. This thesis focuses on the perceptual mechanisms of vision-based soft tactile sensing through deformation reconstruction and their application to dexterous robotic manipulation in amphibious environments.
Soft robots undergo continuous deformation when physically interacting with the external environment, and these deformations of the soft body record rich tactile information. Reconstructing the deformation of the soft structure in three-dimensional space is therefore key to decoding tactile information. This thesis establishes a multimodal sensing framework that, on physical principles, couples the deformation energy of hyperelastic materials with the feature similarity of visual observation points. Following the principle of minimum total potential energy, solving for the deformation field of a hyperelastic medium in its equilibrium state is formulated as a constrained optimization problem under visual observation. Using finite-element spatial discretization and visual observation-point constraints, an efficient numerical algorithm is proposed for real-time deformation reconstruction of the soft contact medium. To validate the proposed perceptual mechanisms, the hardware structure is optimized and the visuotactile algorithm is integrated into a class of flexible fingers with omnidirectional adaptive capability, endowing passive robotic flexible fingers with active proprioceptive shape sensing and contact force distribution sensing.
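The energy-minimization formulation above can be illustrated with a minimal sketch: a toy one-dimensional elastic chain stands in for the finite-element mesh of the soft finger, and "visual observations" of tracked marker nodes enter as a penalty term. All node counts, stiffnesses, and weights here are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1D chain of nodes as a stand-in for a finite-element mesh of the
# soft finger; all names and constants are illustrative, not from the thesis.
n_nodes = 6
rest = np.linspace(0.0, 1.0, n_nodes)      # rest positions of the nodes
k = 50.0                                   # assumed elastic stiffness

# "Visual observations": measured positions of two tracked marker nodes.
obs_idx = np.array([2, 5])
obs_pos = np.array([0.45, 1.10])
w = 1e3                                    # observation penalty weight

def total_potential(x):
    # Elastic energy of the chain (quadratic spring model) plus a
    # penalty coupling tracked nodes to their visual observations.
    elastic = 0.5 * k * np.sum((np.diff(x) - np.diff(rest)) ** 2)
    visual = 0.5 * w * np.sum((x[obs_idx] - obs_pos) ** 2)
    return elastic + visual

def pinned(x_free):
    # Pin the first node at its rest position (boundary condition).
    return np.concatenate(([rest[0]], x_free))

# Minimize total potential energy to recover the deformed equilibrium state.
res = minimize(lambda xf: total_potential(pinned(xf)), rest[1:],
               method="L-BFGS-B")
x_eq = pinned(res.x)
```

At the minimum, the recovered node positions at the observed indices sit close to the visual measurements while the remaining nodes settle by elasticity, which is the essence of reconstructing the full deformation field from sparse visual constraints.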
Simulation and experimental results demonstrate the real-time performance (≤ 50 ms) and accuracy (≤ 2.5 mm) of the proposed method in reconstructing contact deformations of flexible fingers.
The deformation state of the soft robotic fingers during contact is encoded in the form of visual images, yet decoding the three-dimensional deformation field of flexible structures from two-dimensional images to obtain tactile information is extremely challenging. For complex cross-medium visual-tactile perception tasks, this thesis proposes a supervised variational autoencoder learning framework to establish a visual-tactile cross-modal perception model. The deformation information of the flexible fingers, encoded by visual images and mechanical principles, is jointly encoded and represented as interpretable latent features, which are used to convert between the visual and tactile modalities, yielding a more generalizable visuotactile sensing mechanism. Experimental studies on datasets and in amphibious environments confirm the effectiveness and accuracy of the cross-medium inference model for vision-based tactile perception (R² ≥ 0.98).
Applications of visuotactile sensing in robotic dexterous manipulation mainly include adaptive grasping and environment exploration; however, studies applying visuotactile sensing in amphibious environments and investigating dexterous manipulation based on flexible visuotactile sensing remain rare. This thesis studies the grasping performance of the flexible-finger visuotactile sensing system in amphibious environments, establishing adaptive robotic grasping platforms on land and underwater to verify the adaptability and advantages of the proposed sensing system in amphibious grasping operations.
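The cross-modal idea of a shared latent representation can be sketched with a deliberately simple linear stand-in (not the thesis's supervised variational autoencoder): synthetic "visual" and "tactile" measurements are generated from a common low-dimensional latent state, a least-squares encoder maps visual data to the latent space, and a decoder maps latent features to the tactile modality. Every dimension, map, and noise level below is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic shared-latent data: both modalities are generated from a common
# low-dimensional latent state. All dimensions are illustrative assumptions.
n, d_latent, d_vis, d_tac = 500, 3, 16, 6
z = rng.normal(size=(n, d_latent))                 # shared latent state
A = rng.normal(size=(d_latent, d_vis))             # latent -> visual map
B = rng.normal(size=(d_latent, d_tac))             # latent -> tactile map
vis = z @ A + 0.01 * rng.normal(size=(n, d_vis))   # "visual" observations
tac = z @ B + 0.01 * rng.normal(size=(n, d_tac))   # "tactile" observations

# Supervised "encoder": fit visual -> latent by least squares, then a
# "decoder" latent -> tactile, giving a visual -> latent -> tactile pipeline.
enc, *_ = np.linalg.lstsq(vis, z, rcond=None)
dec, *_ = np.linalg.lstsq(z, tac, rcond=None)
tac_pred = (vis @ enc) @ dec

# Coefficient of determination (R^2) of the cross-modal prediction.
ss_res = np.sum((tac - tac_pred) ** 2)
ss_tot = np.sum((tac - tac.mean(axis=0)) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

On this synthetic data the visual-to-tactile prediction is near perfect, mirroring (in a toy linear setting) how a shared, interpretable latent state allows conversion between the two modalities.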
Additionally, to study the environment exploration performance of the flexible-finger visuotactile sensing system, experiments on two-dimensional weld seam tracking in industrial scenarios and three-dimensional object shape sensing in underwater salvage scenarios were completed, validating the accuracy and robustness of the proposed sensing system in amphibious environment exploration tasks. The proposed perceptual mechanisms of vision-based soft tactile sensing through deformation reconstruction provide theoretical guidance and a design basis for new visuotactile sensor hardware and sensing algorithms. The application of robotic dexterous manipulation in amphibious environments opens new research directions and application fields for visuotactile sensing technology.
BibTeX
@phdthesis{Guo2024PhDThesis,
  title     = {Soft Robotic Perception Mechanism via Vision-Based Tactile Reconstruction and its Amphibious Applications in Dexterous Manipulation},
  author    = {Ning Guo},
  year      = {2024},
  date      = {2024-08-29},
  urldate   = {2024-08-29},
  school    = {Southern University of Science and Technology},
  keywords  = {Doctoral, Supervisor, SUSTech PhD Program, Thesis},
  pubstate  = {published},
  tppubtype = {phdthesis}
}