Robotic Manipulation
Robotic manipulation research focuses on enabling robots to interact dexterously with their environment, accomplishing complex tasks through precise movements and controlled contact with objects. Current efforts concentrate on improving the robustness and generalization of manipulation policies, often leveraging vision-language models, transformer architectures, and reinforcement learning so that robots can interpret diverse instructions and adapt to varied environments. The field is central to advancing automation across sectors such as manufacturing, logistics, healthcare, and agriculture by producing more adaptable and reliable robotic systems capable of handling a wider range of tasks. Significant attention is also being paid to developing more efficient data collection methods and to improving the safety and reliability of these systems.
Papers
FoAM: Foresight-Augmented Multi-Task Imitation Policy for Robotic Manipulation
Litao Liu, Wentao Wang, Yifan Han, Zhuoli Xie, Pengfei Yi, Junyan Li, Yi Qin, Wenzhao Lian
Fast-UMI: A Scalable and Hardware-Independent Universal Manipulation Interface
Ziniu Wu, Tianyu Wang, Zhaxizhuoma, Chuyue Guan, Zhongjie Jia, Shuai Liang, Haoming Song, Delin Qu, Dong Wang, Zhigang Wang, Nieqing Cao, Yan Ding, Bin Zhao, Xuelong Li
Towards Testing and Evaluating Vision-Language-Action Models for Robotic Manipulation: An Empirical Study
Zhijie Wang, Zhehua Zhou, Jiayang Song, Yuheng Huang, Zhan Shu, Lei Ma
Optimal Cosserat-based deformation control for robotic manipulation of linear objects
Azad Artinian, Faiz Ben Amar, Veronique Perdereau
TinyVLA: Towards Fast, Data-Efficient Vision-Language-Action Models for Robotic Manipulation
Junjie Wen, Yichen Zhu, Jinming Li, Minjie Zhu, Kun Wu, Zhiyuan Xu, Ran Cheng, Chaomin Shen, Yaxin Peng, Feifei Feng, Jian Tang
Shape-Space Deformer: Unified Visuo-Tactile Representations for Robotic Manipulation of Deformable Objects
Sean M. V. Collins, Brendan Tidd, Mahsa Baktashmotlagh, Peyman Moghadam
Closed-Loop Visuomotor Control with Generative Expectation for Robotic Manipulation
Qingwen Bu, Jia Zeng, Li Chen, Yanchao Yang, Guyue Zhou, Junchi Yan, Ping Luo, Heming Cui, Yi Ma, Hongyang Li
ClearDepth: Enhanced Stereo Perception of Transparent Objects for Robotic Manipulation
Kaixin Bai, Huajian Zeng, Lei Zhang, Yiwen Liu, Hongli Xu, Zhaopeng Chen, Jianwei Zhang