Tactile Sensing
Tactile sensing aims to equip robots with a sense of touch, enabling them to perform complex manipulation tasks that require fine motor skills and interaction with diverse objects. Current research focuses on developing robust, versatile tactile sensors, often vision-based to capture richer contact data, and on machine learning techniques such as convolutional neural networks, transformers, and Bayesian estimation for processing tactile signals and learning manipulation policies. The field is central to advancing robotics, particularly dexterous manipulation, human-robot collaboration, and other applications that demand precise interaction with delicate or unstructured environments.
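To make the convolutional approach mentioned above concrete, below is a minimal sketch of a CNN that classifies vision-based tactile images into contact states. It is not drawn from any of the papers listed here: the TactileCNN class, the 64x64 RGB input size, and the three contact-state labels are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not an API from TacSL or any sensor vendor):
# a small convolutional network that maps a vision-based tactile image
# (e.g., a gel-deformation frame) to a hypothetical contact-state label.
import torch
import torch.nn as nn

NUM_CLASSES = 3  # hypothetical labels: no-contact, stable-contact, slip

class TactileCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Two conv blocks downsample the assumed 64x64 tactile image to 16x16.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),  # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

if __name__ == "__main__":
    model = TactileCNN()
    batch = torch.randn(8, 3, 64, 64)  # stand-in for 8 tactile frames
    logits = model(batch)
    print(logits.shape)  # torch.Size([8, 3])
```

In practice, classifiers like this are trained on labeled tactile frames, either collected on hardware or generated in simulators such as the TacSL library listed below.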
Papers
TacSL: A Library for Visuotactile Sensor Simulation and Learning
Iretiayo Akinola, Jie Xu, Jan Carius, Dieter Fox, Yashraj Narang
EyeSight Hand: Design of a Fully-Actuated Dexterous Robot Hand with Integrated Vision-Based Tactile Sensors and Compliant Actuation
Branden Romero, Hao-Shu Fang, Pulkit Agrawal, Edward Adelson