Vision-based Tactile Sensing for an Omni-adaptive Soft Finger


Xudong Han, Sheng Liu, Fang Wan, Chaoyang Song: Vision-based Tactile Sensing for an Omni-adaptive Soft Finger. IEEE International Conference on Development and Learning (ICDL2023), Macau SAR, 2023.

Abstract

Vision-based tactile sensing provides a novel solution to robotic proprioception using visual information to infer physical interaction on the contact surface. In this paper, we leveraged the omni-adaptive capability of a soft finger with differential stiffness by adding a monocular camera at its bottom to track its spatial deformation while interacting with objects. We modeled this soft finger's physical interaction and measured the stiffness distribution through experiments. The camera captured the soft finger's deformation when interacting with probes for different contact forces and positions. Using a neural network modified from AlexNet, we proposed a preliminary estimation model of the contact force and position using the captured images. The results show that the proposed method can achieve an accuracy of 90% for position estimation and a normalized root mean squared error of 3.4% for force estimation, demonstrating the reliability and robustness of the proposed sensing method.
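The two figures reported above are standard evaluation metrics: classification accuracy for the contact position and a root mean squared error normalized by the ground-truth force range (NRMSE) for the contact force. A minimal sketch of how such metrics are typically computed is shown below; the function names and the range-based normalization are illustrative assumptions, not taken from the paper.

```python
import math

def position_accuracy(pred_labels, true_labels):
    """Fraction of contact positions classified correctly (illustrative)."""
    correct = sum(p == t for p, t in zip(pred_labels, true_labels))
    return correct / len(true_labels)

def force_nrmse(pred_forces, true_forces):
    """RMSE of force predictions, normalized by the ground-truth force range.

    Note: normalizing by the range is one common convention; the paper may
    use a different normalizer (e.g. the mean or maximum force).
    """
    n = len(true_forces)
    mse = sum((p - t) ** 2 for p, t in zip(pred_forces, true_forces)) / n
    return math.sqrt(mse) / (max(true_forces) - min(true_forces))
```

For example, a predicted force series with RMSE of 0.034 N over a 1 N ground-truth range would yield the 3.4% NRMSE reported in the abstract.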

BibTeX

@conference{Han2023VisionBased,
title = {Vision-based Tactile Sensing for an Omni-adaptive Soft Finger},
author = {Xudong Han and Sheng Liu and Fang Wan and Chaoyang Song},
url = {https://www.proceedings.com/content/072/072332webtoc.pdf},
doi = {10.1109/ICDL55364.2023.10364455},
year = {2023},
date = {2023-11-09},
urldate = {2023-11-09},
booktitle = {IEEE International Conference on Development and Learning (ICDL2023)},
address = {Macau SAR},
abstract = {Vision-based tactile sensing provides a novel solution to robotic proprioception using visual information to infer physical interaction on the contact surface. In this paper, we leveraged the omni-adaptive capability of a soft finger with differential stiffness by adding a monocular camera at its bottom to track its spatial deformation while interacting with objects. We modeled this soft finger's physical interaction and measured the stiffness distribution through experiments. The camera captured the soft finger's deformation when interacting with probes for different contact forces and positions. Using a neural network modified from AlexNet, we proposed a preliminary estimation model of the contact force and position using the captured images. The results show that the proposed method can achieve an accuracy of 90% for position estimation and a normalized root mean squared error of 3.4% for force estimation, demonstrating the reliability and robustness of the proposed sensing method.},
keywords = {Corresponding Author, ICDL},
pubstate = {published},
tppubtype = {conference}
}