Anchoring Morphological Representations Unlocks Latent Proprioception in Soft Robots


Xudong Han, Ning Guo, Ronghan Xu, Chaoyang Song, Fang Wan: Anchoring Morphological Representations Unlocks Latent Proprioception in Soft Robots. Forthcoming (submitted to IEEE Transactions on Robotics).

Abstract

This research addresses the need for robust proprioceptive methods that capture the continuous deformations of soft robots without relying on multiple sensors that hinder compliance. We propose a bio-inspired strategy called latent proprioception, which anchors the robot's overall deformation state to a single internal reference frame tracked by a miniature onboard camera. Through a multi-modal neural network trained on simulated and real data, we unify motion, force, and shape measurements into a shared representation of latent codes, inferring unseen states from readily measured signals. Our experimental results show that this approach accurately reconstructs full-body deformations and forces from minimal sensing data, enabling soft robots to adapt to tasks such as complex object manipulation and safe human interaction. The proposed framework exemplifies how biological principles can inform and enhance robotics by reducing sensor complexity and preserving mechanical flexibility. We anticipate that such hybrid system codesign will advance robotic capabilities, deepen our understanding of natural movement, and potentially translate back into healthcare and wearable technologies for living beings. This work paves the way for soft robots endowed with greater autonomy and resilience. All code is available on GitHub: https://github.com/ancorasir/ProSoRo.
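The core idea of the abstract, that several modalities (motion, force, shape) share one latent representation so an unmeasured modality can be decoded from a measured one, can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's architecture: the dimensionalities, the linear encoders/decoders, and the latent averaging are all placeholders standing in for trained networks.

```python
import numpy as np

# Hypothetical shared-latent sketch: each modality has an encoder into a
# common latent code and a decoder back out. Random linear maps stand in
# for the trained multi-modal neural network described in the abstract.
rng = np.random.default_rng(0)
DIMS = {"motion": 6, "force": 6, "shape": 300}  # assumed dimensionalities
LATENT = 16                                      # assumed latent size

enc = {m: rng.standard_normal((LATENT, d)) * 0.1 for m, d in DIMS.items()}
dec = {m: rng.standard_normal((d, LATENT)) * 0.1 for m, d in DIMS.items()}

def infer(measured: dict, target: str) -> np.ndarray:
    """Encode the measured modalities, average their latent codes into a
    shared code, and decode the unseen target modality from it."""
    codes = [enc[m] @ x for m, x in measured.items()]
    z = np.mean(codes, axis=0)   # shared latent code
    return dec[target] @ z       # cross-modal reconstruction

# Example: estimate the full-body shape from a single motion measurement.
motion = rng.standard_normal(DIMS["motion"])
shape_estimate = infer({"motion": motion}, target="shape")
print(shape_estimate.shape)  # (300,)
```

In the actual system the encoders and decoders would be trained jointly on simulated and real data so that all modalities map to consistent latent codes; the averaging step here is just one simple way to fuse whichever modalities happen to be available at inference time.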

BibTeX

@online{Han2025AnchoringMorphological,
title = {Anchoring Morphological Representations Unlocks Latent Proprioception in Soft Robots},
author = {Xudong Han and Ning Guo and Ronghan Xu and Chaoyang Song and Fang Wan},
url = {https://github.com/ancorasir/ProSoRo},
year = {2025},
date = {2025-03-14},
abstract = {This research addresses the need for robust proprioceptive methods that capture the continuous deformations of soft robots without relying on multiple sensors that hinder compliance. We propose a bio-inspired strategy called \textit{latent proprioception}, which anchors the robot's overall deformation state to a single internal reference frame tracked by a miniature onboard camera. Through a multi-modal neural network trained on simulated and real data, we unify motion, force, and shape measurements into a shared representation of \textit{latent codes}, inferring unseen states from readily measured signals. Our experimental results show that this approach accurately reconstructs full-body deformations and forces from minimal sensing data, enabling soft robots to adapt to tasks such as complex object manipulation and safe human interaction. The proposed framework exemplifies how biological principles can inform and enhance robotics by reducing sensor complexity and preserving mechanical flexibility. We anticipate that such hybrid system codesign will advance robotic capabilities, deepen our understanding of natural movement, and potentially translate back into healthcare and wearable technologies for living beings. This work paves the way for soft robots endowed with greater autonomy and resilience. All code is available on GitHub: https://github.com/ancorasir/ProSoRo.},
note = {Submitted to IEEE Transactions on Robotics},
keywords = {Authorship - Corresponding, Status - Under Review},
pubstate = {forthcoming},
tppubtype = {online}
}
