DeepClaw 2.0: A Data Collection Platform for Learning Human Manipulation


Haokun Wang, Xiaobo Liu, Nuofan Qiu, Ning Guo, Fang Wan, Chaoyang Song: DeepClaw 2.0: A Data Collection Platform for Learning Human Manipulation. In: Frontiers in Robotics and AI, vol. 9, pp. 787291, 2022.

Abstract

Besides direct interaction, human hands are also skilled at using tools to manipulate objects in typical life and work tasks. This paper proposes DeepClaw 2.0 as a low-cost, open-source data collection platform for learning human manipulation. We use an RGB-D camera to visually track the motion and deformation of a pair of soft finger networks on a modified kitchen tong operated by human teachers. These fingers can be easily integrated with robotic grippers to bridge the structural mismatch between humans and robots during learning. The deformation of the soft finger networks, which reveals tactile information in contact-rich manipulation, is captured passively. We collected a comprehensive sample dataset involving five human demonstrators performing ten manipulation tasks with five trials per task. As a low-cost, open-source platform, we also developed an intuitive interface that converts the raw sensor data into state-action data for imitation learning problems. For learning-by-demonstration problems, we further demonstrated our dataset’s potential by using real robotic hardware to collect joint actuation data, or by using a simulated environment when access to the hardware is limited.

BibTeX

@article{Wang2022DeepClaw2.0,
title = {DeepClaw 2.0: A Data Collection Platform for Learning Human Manipulation},
author = {Haokun Wang and Xiaobo Liu and Nuofan Qiu and Ning Guo and Fang Wan and Chaoyang Song},
url = {https://doi.org/10.3389/frobt.2022.787291},
note = {Sec. Computational Intelligence in Robotics},
doi = {10.3389/frobt.2022.787291},
year = {2022},
date = {2022-03-15},
urldate = {2022-03-15},
journal = {Frontiers in Robotics and AI},
volume = {9},
pages = {787291},
abstract = {Besides direct interaction, human hands are also skilled at using tools to manipulate objects in typical life and work tasks. This paper proposes DeepClaw 2.0 as a low-cost, open-source data collection platform for learning human manipulation. We use an RGB-D camera to visually track the motion and deformation of a pair of soft finger networks on a modified kitchen tong operated by human teachers. These fingers can be easily integrated with robotic grippers to bridge the structural mismatch between humans and robots during learning. The deformation of the soft finger networks, which reveals tactile information in contact-rich manipulation, is captured passively. We collected a comprehensive sample dataset involving five human demonstrators performing ten manipulation tasks with five trials per task. As a low-cost, open-source platform, we also developed an intuitive interface that converts the raw sensor data into state-action data for imitation learning problems. For learning-by-demonstration problems, we further demonstrated our dataset’s potential by using real robotic hardware to collect joint actuation data, or by using a simulated environment when access to the hardware is limited.},
keywords = {Corresponding Author, Front. Robot. AI. (FROBT), JCR Q2},
pubstate = {published},
tppubtype = {article}
}