Proprioceptive State Estimation for Amphibious Tactile Sensing


Xudong Han, Ning Guo, Shuqiao Zhong, Zhiyuan Zhou, Jian Lin, Chaoyang Song, Fang Wan: Proprioceptive State Estimation for Amphibious Tactile Sensing. IEEE International Conference on Robotics and Automation (ICRA2025), Atlanta, USA, 2025, (Dual-track Submission with TRO: https://doi.org/10.1109/TRO.2024.3463509).

Abstract

This paper presents a novel vision-based proprioception approach for a soft robotic finger that can estimate and reconstruct tactile interactions in terrestrial and aquatic environments. The key to this system lies in the finger's unique metamaterial structure, which facilitates omni-directional passive adaptation during grasping, protecting delicate objects across diverse scenarios. A compact in-finger camera captures high-framerate images of the finger's deformation during contact, extracting crucial tactile data in real time. We present a volumetric discretized model of the soft finger and use the geometry constraints captured by the camera to find the optimal estimate of the deformed shape. The approach is benchmarked using a motion capture system with sparse markers and a haptic device with dense measurements. Both results show state-of-the-art accuracies, with a median error of 1.96 mm for overall body deformation, corresponding to 2.1% of the finger's length. More importantly, the state estimation is robust in both on-land and underwater environments, as we demonstrate its usage for underwater object shape sensing. This combination of passive adaptation and real-time tactile sensing paves the way for amphibious robotic grasping applications. All code is shared on GitHub: https://github.com/ancorasir/PropSE.
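The abstract describes fitting a volumetric discretized model of the finger to geometry constraints observed by the in-finger camera. The following is a minimal, hypothetical sketch of that general idea (it is not the authors' implementation; see the linked repository for the actual code): node positions of a discretized body are estimated by least squares, balancing a rest-shape smoothness prior over mesh edges against sparse observed-node constraints.

```python
# Hypothetical sketch: shape estimation of a discretized soft body from
# sparse observations via linear least squares. The mesh, weights, and
# solver choice here are illustrative assumptions, not the paper's method.
import numpy as np

def estimate_shape(rest, edges, observed_idx, observed_pos, w_obs=10.0):
    """rest: (N, 3) rest-pose node positions; edges: list of (i, j) mesh
    edges; observed_idx / observed_pos: indices and measured positions of
    the camera-tracked nodes. Returns (N, 3) estimated deformed positions."""
    n = rest.shape[0]
    rows, rhs = [], []
    # Smoothness prior: each edge should preserve its rest-pose offset.
    for i, j in edges:
        r = np.zeros(n)
        r[i], r[j] = 1.0, -1.0
        rows.append(r)
        rhs.append(rest[i] - rest[j])
    # Data term: estimated positions should match the observations.
    for k, idx in enumerate(observed_idx):
        r = np.zeros(n)
        r[idx] = w_obs
        rows.append(r)
        rhs.append(w_obs * observed_pos[k])
    A = np.vstack(rows)
    b = np.vstack(rhs)
    # Solve the three coordinate systems (x, y, z) in one lstsq call.
    X, *_ = np.linalg.lstsq(A, b, rcond=None)
    return X
```

In this toy form, observing only a subset of nodes still constrains the whole body: unobserved nodes are pulled into place by the edge terms, mirroring how dense deformation is recovered from sparse camera cues.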

BibTeX (Download)

@conference{Han2025ProprioceptiveState,
title = {Proprioceptive State Estimation for Amphibious Tactile Sensing},
author = {Xudong Han and Ning Guo and Shuqiao Zhong and Zhiyuan Zhou and Jian Lin and Chaoyang Song and Fang Wan},
url = {https://github.com/ancorasir/PropSE},
doi = {10.1109/TRO.2024.3463509},
year = {2025},
date = {2025-03-07},
urldate = {2025-03-07},
booktitle = {IEEE International Conference on Robotics and Automation (ICRA2025)},
address = {Atlanta, USA},
abstract = {This paper presents a novel vision-based proprioception approach for a soft robotic finger that can estimate and reconstruct tactile interactions in terrestrial and aquatic environments. The key to this system lies in the finger's unique metamaterial structure, which facilitates omni-directional passive adaptation during grasping, protecting delicate objects across diverse scenarios. A compact in-finger camera captures high-framerate images of the finger's deformation during contact, extracting crucial tactile data in real time. We present a volumetric discretized model of the soft finger and use the geometry constraints captured by the camera to find the optimal estimate of the deformed shape. The approach is benchmarked using a motion capture system with sparse markers and a haptic device with dense measurements. Both results show state-of-the-art accuracies, with a median error of 1.96 mm for overall body deformation, corresponding to 2.1% of the finger's length. More importantly, the state estimation is robust in both on-land and underwater environments, as we demonstrate its usage for underwater object shape sensing. This combination of passive adaptation and real-time tactile sensing paves the way for amphibious robotic grasping applications. All code is shared on GitHub: https://github.com/ancorasir/PropSE.},
note = {Dual-track Submission with TRO: https://doi.org/10.1109/TRO.2024.3463509},
keywords = {Authorship - Corresponding, Conf - ICRA, Special - Dual-Track},
pubstate = {published},
tppubtype = {conference}
}