Proprioceptive State Estimation for Amphibious Tactile Sensing


Ning Guo, Xudong Han, Shuqiao Zhong, Zhiyuan Zhou, Jian Lin, Jiansheng Dai, Fang Wan, Chaoyang Song: Proprioceptive State Estimation for Amphibious Tactile Sensing. In: IEEE Transactions on Robotics, vol. 40, iss. September, pp. 4684-4698, 2024.

Abstract

This article presents a novel vision-based proprioception approach for a soft robotic finger that can estimate and reconstruct tactile interactions in terrestrial and aquatic environments. The key to this system lies in the finger's unique metamaterial structure, which facilitates omnidirectional passive adaptation during grasping, protecting delicate objects across diverse scenarios. A compact in-finger camera captures high-framerate images of the finger's deformation during contact, extracting crucial tactile data in real time. We present a volumetric discretized model of the soft finger and use the geometry constraints captured by the camera to find the optimal estimation of the deformed shape. The approach is benchmarked using a motion capture system with sparse markers and a haptic device with dense measurements. Both results show state-of-the-art accuracy, with a median error of 1.96 mm for overall body deformation, corresponding to 2.1 % of the finger's length. More importantly, the state estimation is robust in both on-land and underwater environments as we demonstrate its usage for underwater object shape sensing. This combination of passive adaptation and real-time tactile sensing paves the way for amphibious robotic grasping applications.
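The core computational step described above, fitting a volumetric discretized finger model to the geometry constraints observed by the in-finger camera, can be illustrated with a small sketch. The following Python snippet is not the authors' implementation; it poses the deformed-shape estimate as a linear least-squares problem in which a Laplacian smoothness prior keeps the mesh close to its rest shape while soft constraints pull the camera-tracked nodes to their observed positions. All names (rest_nodes, edges, observed_idx, observed_pos, w_obs) are hypothetical.

import numpy as np
from scipy.sparse import lil_matrix, vstack
from scipy.sparse.linalg import lsqr

def estimate_deformed_shape(rest_nodes, edges, observed_idx, observed_pos, w_obs=10.0):
    """Estimate deformed node positions of a volumetric mesh.

    rest_nodes   : (n, 3) undeformed node positions
    edges        : iterable of (i, j) node index pairs of the mesh
    observed_idx : indices of the nodes tracked by the in-finger camera
    observed_pos : (m, 3) observed positions of those nodes
    w_obs        : weight of the soft positional constraints
    """
    n = rest_nodes.shape[0]

    # Uniform graph Laplacian of the volumetric mesh.
    L = lil_matrix((n, n))
    for i, j in edges:
        L[i, i] += 1.0; L[j, j] += 1.0
        L[i, j] -= 1.0; L[j, i] -= 1.0
    L = L.tocsr()

    # Differential coordinates of the rest shape serve as the deformation prior.
    delta = L @ rest_nodes                        # (n, 3)

    # Selector matrix picking out the camera-observed nodes.
    C = lil_matrix((len(observed_idx), n))
    for row, idx in enumerate(observed_idx):
        C[row, idx] = 1.0

    # Stack prior and constraints; solve once per coordinate axis.
    A = vstack([L, w_obs * C.tocsr()])
    X = np.zeros_like(rest_nodes, dtype=float)
    for d in range(3):
        b = np.concatenate([delta[:, d], w_obs * observed_pos[:, d]])
        X[:, d] = lsqr(A, b)[0]
    return X

A full implementation for large deformations would replace the linear Laplacian term with a nonlinear elastic energy of the metamaterial structure, but the constrained-optimization structure, a deformation prior reconciled with camera-derived geometry constraints, is the same.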

BibTeX

@article{Guo2024ProprioceptiveState,
title = {Proprioceptive State Estimation for Amphibious Tactile Sensing},
author = {Ning Guo and Xudong Han and Shuqiao Zhong and Zhiyuan Zhou and Jian Lin and Jiansheng Dai and Fang Wan and Chaoyang Song},
doi = {10.1109/TRO.2024.3463509},
year = {2024},
date = {2024-09-18},
urldate = {2024-09-18},
journal = {IEEE Transactions on Robotics},
volume = {40},
issue = {September},
pages = {4684--4698},
abstract = {This article presents a novel vision-based proprioception approach for a soft robotic finger that can estimate and reconstruct tactile interactions in terrestrial and aquatic environments. The key to this system lies in the finger's unique metamaterial structure, which facilitates omnidirectional passive adaptation during grasping, protecting delicate objects across diverse scenarios. A compact in-finger camera captures high-framerate images of the finger's deformation during contact, extracting crucial tactile data in real time. We present a volumetric discretized model of the soft finger and use the geometry constraints captured by the camera to find the optimal estimation of the deformed shape. The approach is benchmarked using a motion capture system with sparse markers and a haptic device with dense measurements. Both results show state-of-the-art accuracy, with a median error of 1.96 mm for overall body deformation, corresponding to 2.1 % of the finger's length. More importantly, the state estimation is robust in both on-land and underwater environments as we demonstrate its usage for underwater object shape sensing. This combination of passive adaptation and real-time tactile sensing paves the way for amphibious robotic grasping applications.},
key = {2024-J-TRO-ProprioceptiveState},
keywords = {Corresponding Author, IEEE Trans. Robot. (T-RO), JCR Q1},
pubstate = {published},
tppubtype = {article}
}