Paper ID: 2404.19585

Integrating and Evaluating Visuo-tactile Sensing with Haptic Feedback for Teleoperated Robot Manipulation

Noah Becker, Kyrylo Sovailo, Chunyao Zhu, Erik Gattung, Kay Hansel, Tim Schneider, Yaonan Zhu, Yasuhisa Hasegawa, Jan Peters

Telerobotics enables humans to overcome spatial constraints and physically interact with the environment in remote locations. However, the sensory feedback provided by the system to the user is often purely visual, limiting the user's dexterity in manipulation tasks. This work addresses this issue by equipping the robot's end-effector with high-resolution visuo-tactile GelSight sensors. Using low-cost MANUS-Gloves, we provide the user with haptic feedback about forces acting at the points of contact in the form of vibration signals. We employ two different methods for estimating these forces: one based on tracking the movement of markers on the sensor surface and one based on deep learning. Additionally, we integrate our system into a virtual-reality teleoperation pipeline in which a human user controls both arms of a Tiago robot while receiving visual and haptic feedback. Lastly, we present a novel experimental setup for evaluating normal force, shear force, and slip. We believe that integrating haptic feedback is a crucial step towards dexterous manipulation in teleoperated robotic systems.
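The abstract describes a marker-based method that estimates contact forces from the motion of markers on the GelSight surface and converts them into vibration feedback. The sketch below is not the paper's implementation; it is a minimal illustration of that general idea under common assumptions: marker displacements between two sensor frames are tracked with OpenCV optical flow, the mean displacement magnitude serves as a shear-force proxy, and a hypothetical gain maps it to a vibration intensity in [0, 1]. All function names and parameter values are illustrative.

```python
# Illustrative sketch (not the paper's implementation): estimate a shear-force
# proxy from GelSight-style marker motion and map it to a vibration intensity.
import cv2
import numpy as np


def estimate_shear_proxy(prev_gray: np.ndarray, curr_gray: np.ndarray) -> float:
    """Track marker-like features between two grayscale sensor frames and
    return the mean displacement magnitude (pixels) as a shear-force proxy."""
    # Detect marker-like corners in the previous frame (parameters are guesses).
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return 0.0
    # Track the detected points into the current frame with pyramidal Lucas-Kanade.
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.reshape(-1) == 1
    if not np.any(good):
        return 0.0
    displacements = (new_pts[good] - pts[good]).reshape(-1, 2)
    return float(np.linalg.norm(displacements, axis=1).mean())


def vibration_intensity(shear_proxy_px: float, gain: float = 0.2) -> float:
    """Map the shear proxy to a vibration command in [0, 1]; the gain is a
    hypothetical tuning parameter, not a value from the paper."""
    return float(np.clip(gain * shear_proxy_px, 0.0, 1.0))


if __name__ == "__main__":
    # Two synthetic frames stand in for consecutive GelSight images.
    prev_frame = np.zeros((240, 320), dtype=np.uint8)
    curr_frame = np.zeros((240, 320), dtype=np.uint8)
    # Draw a grid of "markers"; shift it by 3 px in the second frame.
    for y in range(20, 240, 20):
        for x in range(20, 320, 20):
            cv2.circle(prev_frame, (x, y), 3, 255, -1)
            cv2.circle(curr_frame, (x + 3, y), 3, 255, -1)
    proxy = estimate_shear_proxy(prev_frame, curr_frame)
    print(f"shear proxy: {proxy:.2f} px, vibration: {vibration_intensity(proxy):.2f}")
```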

Submitted: Apr 30, 2024