




Journal Articles
Victor-Louis De Gusseme, Thomas Lips, Remko Proesmans, Julius Hietala, Giwan Lee, Jiyoung Choi, Jeongil Choi, Geon Kim, Phayuth Yonrith, Domen Tabernik, Andrej Gams, Peter Nimac, Matej Urbas, Jon Muhovic, Danijel Skocaj, Matija Mavsar, Hyojeong Yu, Minseo Kwon, Young J. Kim, Yang Cong, Ronghan Chen, Yu Ren, Supeng Diao, Jiawei Weng, Jiayue Liu, Haoran Sun, Linhan Yang, Zeqing Zhang, Ning Guo, Lei Yang, Fang Wan, Chaoyang Song, Jia Pan, Yixiang Jin, Yong A, Jun Shi, Dingzhe Li, Yong Yang, Kakeru Yamasaki, Takumi Kajiwara, Yuki Nakadera, Krati Saxena, Tomohiro Shibata, Chongkun Xia, Kai Mo, Yanzhao Yu, Qihao Lin, Binqiang Ma, Uihun Sagong, JungHyun Choi, JeongHyun Park, Dongwoo Lee, Yeongmin Kim, Myun Joong Hwang, Yusuke Kuribayashi, Naoki Hiratsuka, Daisuke Tanaka, Solvi Arnold, Kimitoshi Yamazaki, Carlos Mateo-Agullo, Andreas Verleysen, Francis wyffels
A Dataset and Benchmark for Robotic Cloth Unfolding Grasp Selection: The ICRA 2024 Cloth Competition Journal Article Forthcoming
In: The International Journal of Robotics Research, Forthcoming, (Accepted).
@article{DeGusseme2024BenchmarkingGrasp,
title = {A Dataset and Benchmark for Robotic Cloth Unfolding Grasp Selection: The ICRA 2024 Cloth Competition},
author = {Victor-Louis De Gusseme and Thomas Lips and Remko Proesmans and Julius Hietala and Giwan Lee and Jiyoung Choi and Jeongil Choi and Geon Kim and Phayuth Yonrith and Domen Tabernik and Andrej Gams and Peter Nimac and Matej Urbas and Jon Muhovic and Danijel Skocaj and Matija Mavsar and Hyojeong Yu and Minseo Kwon and Young J. Kim and Yang Cong and Ronghan Chen and Yu Ren and Supeng Diao and Jiawei Weng and Jiayue Liu and Haoran Sun and Linhan Yang and Zeqing Zhang and Ning Guo and Lei Yang and Fang Wan and Chaoyang Song and Jia Pan and Yixiang Jin and Yong A and Jun Shi and Dingzhe Li and Yong Yang and Kakeru Yamasaki and Takumi Kajiwara and Yuki Nakadera and Krati Saxena and Tomohiro Shibata and Chongkun Xia and Kai Mo and Yanzhao Yu and Qihao Lin and Binqiang Ma and Uihun Sagong and JungHyun Choi and JeongHyun Park and Dongwoo Lee and Yeongmin Kim and Myun Joong Hwang and Yusuke Kuribayashi and Naoki Hiratsuka and Daisuke Tanaka and Solvi Arnold and Kimitoshi Yamazaki and Carlos Mateo-Agullo and Andreas Verleysen and Francis wyffels},
url = {https://airo.ugent.be/cloth_competition/
https://doi.org/10.48550/arXiv.2508.16749},
doi = {10.1177/02783649251414885},
year = {2025},
date = {2025-01-10},
urldate = {2025-01-10},
journal = {The International Journal of Robotics Research},
abstract = {Robotic cloth manipulation suffers from a lack of standardized benchmarks and shared datasets for evaluating and comparing different approaches. To address this, we organized the ICRA 2024 Cloth Competition, a unique head-to-head evaluation focused on grasp pose selection for cloth unfolding. Eleven diverse teams competed with a shared dual-arm robot, utilizing our publicly released dataset of real-world robotic cloth unfolding attempts. We expanded this dataset with 176 live evaluation trials, which now encompasses 679 unfolding demonstrations across 34 garments. The competition established a key benchmark and reference for robotic cloth manipulation. Analysis revealed a significant discrepancy between competition performance and prior work, underscoring the importance of independent out-of-the-lab evaluation in robotic cloth manipulation. The resulting dataset, one of the most comprehensive collections of real-world robotic cloth manipulation data, is a valuable resource for developing and evaluating grasp selection methods, particularly for learning-based approaches. It can serve as a foundation for future benchmarks and drive further progress in data-driven robotic cloth manipulation.},
note = {Accepted},
keywords = {Authorship - Co-Author, JCR Q2, Jour - Int. J. Robot. Res. (IJRR)},
pubstate = {forthcoming},
tppubtype = {article}
}
Xiaobo Liu, Xudong Han, Wei Hong, Fang Wan, Chaoyang Song
Proprioceptive Learning with Soft Polyhedral Networks Journal Article
In: The International Journal of Robotics Research, vol. 43, no. 12, pp. 1916-1935, 2024.
@article{Liu2024ProprioceptiveLearning,
title = {Proprioceptive Learning with Soft Polyhedral Networks},
author = {Xiaobo Liu and Xudong Han and Wei Hong and Fang Wan and Chaoyang Song},
doi = {10.1177/02783649241238765},
year = {2024},
date = {2024-10-07},
urldate = {2024-10-07},
journal = {The International Journal of Robotics Research},
volume = {43},
number = {12},
pages = {1916--1935},
abstract = {Proprioception is the “sixth sense” that detects limb postures with motor neurons. It requires a natural integration between the musculoskeletal systems and sensory receptors, which is challenging among modern robots that aim for lightweight, adaptive, and sensitive designs at low costs in mechanical design and algorithmic computation. Here, we present the Soft Polyhedral Network with an embedded vision for physical interactions, capable of adaptive kinesthesia and viscoelastic proprioception by learning kinetic features. This design enables passive adaptations to omni-directional interactions, visually captured by a miniature high-speed motion-tracking system embedded inside for proprioceptive learning. The results show that the soft network can infer real-time 6D forces and torques with accuracies of 0.25/0.24/0.35 N and 0.025/0.034/0.006 Nm in dynamic interactions. We also incorporate viscoelasticity in proprioception during static adaptation by adding a creep and relaxation modifier to refine the predicted results. The proposed soft network combines simplicity in design, omni-adaptation, and proprioceptive sensing with high accuracy, making it a versatile solution for robotics at a low material cost with more than one million use cycles for tasks such as sensitive and competitive grasping and touch-based geometry reconstruction. This study offers new insights into vision-based proprioception for soft robots in adaptive grasping, soft manipulation, and human-robot interaction.},
keywords = {Authorship - Corresponding, JCR Q2, Jour - Int. J. Robot. Res. (IJRR)},
pubstate = {published},
tppubtype = {article}
}

