Vision-based Tactile Intelligence with Soft Robotic Metamaterial


Tianyu Wu, Yujian Dong, Xiaobo Liu, Xudong Han, Yang Xiao, Jinqi Wei, Fang Wan, Chaoyang Song: Vision-based Tactile Intelligence with Soft Robotic Metamaterial. In: Materials & Design, vol. 238, iss. February, pp. 112629, 2024.

Abstract

Robotic metamaterials represent an innovative approach to creating synthetic structures that combine desired material characteristics with embodied intelligence, blurring the boundaries between materials and machinery. Inspired by the functional qualities of biological skin, integrating tactile intelligence into these materials has gained significant interest for both research and practical applications. This study introduces a Soft Robotic Metamaterial (SRM) design featuring omnidirectional adaptability and superior tactile sensing, combining vision-based motion tracking and machine learning. The study compares two sensory integration methods against a baseline comprising a state-of-the-art motion tracking system and a force/torque sensor: an internal-vision design with high frame rates and an external-vision design offering cost-effectiveness. The results demonstrate that the internal-vision SRM design achieves an impressive tactile accuracy of 98.96%, enabling soft and adaptive tactile interactions that are especially beneficial for dexterous robotic grasping. The external-vision design offers similar performance at a reduced cost and can be adapted for portability, enhancing materials science education and robotic learning. This research significantly advances tactile sensing using vision-based motion tracking in soft robotic metamaterials, and the open-source availability on GitHub fosters collaboration and further exploration of this innovative technology (https://github.com/bionicdl-sustech/SoftRoboticTongs).
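
For readers who want to experiment with the core idea before exploring the repository, the sketch below illustrates the sensing principle described in the abstract: displacements of vision-tracked markers on the soft metamaterial are mapped to a contact wrench (force/torque) by a learned regressor. This is a minimal sketch under stated assumptions, not the paper's actual pipeline; the marker count, wrench dimensionality, synthetic data, and ridge-regression model are all illustrative choices (see the SoftRoboticTongs repository for the authors' implementation).

# Hypothetical sketch: map vision-tracked marker displacements on a soft
# metamaterial to a contact wrench (force/torque). Marker count, wrench
# dimensionality, and the linear model are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

N_MARKERS = 16     # assumed number of tracked markers on the lattice
N_SAMPLES = 2000   # synthetic training examples
WRENCH_DIM = 6     # Fx, Fy, Fz, Tx, Ty, Tz

# Synthetic stand-in for training data: an unknown linear compliance map
# from wrench to 2D marker displacements, plus tracking noise. In the real
# system, displacements would come from camera-based marker tracking and
# labels from the force/torque sensor baseline.
true_map = rng.normal(size=(WRENCH_DIM, 2 * N_MARKERS))
wrench = rng.uniform(-5.0, 5.0, size=(N_SAMPLES, WRENCH_DIM))
disp = wrench @ true_map + 0.01 * rng.normal(size=(N_SAMPLES, 2 * N_MARKERS))

# Learn the inverse map (displacements -> wrench) with ridge regression.
lam = 1e-3
W = np.linalg.solve(disp.T @ disp + lam * np.eye(disp.shape[1]), disp.T @ wrench)

# Evaluate on held-out synthetic data.
wrench_test = rng.uniform(-5.0, 5.0, size=(200, WRENCH_DIM))
disp_test = wrench_test @ true_map + 0.01 * rng.normal(size=(200, 2 * N_MARKERS))
pred = disp_test @ W
rmse = np.sqrt(np.mean((pred - wrench_test) ** 2))
print(f"synthetic wrench RMSE: {rmse:.4f}")

In the actual system, the displacement features would be extracted by the internal- or external-vision camera setup, and a learned model (rather than this toy linear regressor) would produce the tactile estimates reported in the paper.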

BibTeX (Download)

@article{Wu2024VisionBasedSRM,
title = {Vision-based Tactile Intelligence with Soft Robotic Metamaterial},
author = {Tianyu Wu and Yujian Dong and Xiaobo Liu and Xudong Han and Yang Xiao and Jinqi Wei and Fang Wan and Chaoyang Song},
doi = {10.1016/j.matdes.2024.112629},
year = {2024},
date = {2024-02-01},
urldate = {2024-02-01},
journal = {Materials \& Design},
volume = {238},
issue = {February},
pages = {112629},
abstract = {Robotic metamaterials represent an innovative approach to creating synthetic structures that combine desired material characteristics with embodied intelligence, blurring the boundaries between materials and machinery. Inspired by the functional qualities of biological skin, integrating tactile intelligence into these materials has gained significant interest for both research and practical applications. This study introduces a Soft Robotic Metamaterial (SRM) design featuring omnidirectional adaptability and superior tactile sensing, combining vision-based motion tracking and machine learning. The study compares two sensory integration methods against a baseline comprising a state-of-the-art motion tracking system and a force/torque sensor: an internal-vision design with high frame rates and an external-vision design offering cost-effectiveness. The results demonstrate that the internal-vision SRM design achieves an impressive tactile accuracy of 98.96%, enabling soft and adaptive tactile interactions that are especially beneficial for dexterous robotic grasping. The external-vision design offers similar performance at a reduced cost and can be adapted for portability, enhancing materials science education and robotic learning. This research significantly advances tactile sensing using vision-based motion tracking in soft robotic metamaterials, and the open-source availability on GitHub fosters collaboration and further exploration of this innovative technology (https://github.com/bionicdl-sustech/SoftRoboticTongs).},
keywords = {Corresponding Author, JCR Q1, Mat. Des. (MADE)},
pubstate = {published},
tppubtype = {article}
}