Tactile Sensing
Tactile sensing aims to equip robots with a sense of touch, enabling complex manipulation tasks that demand fine motor skills and interaction with diverse objects. Current research focuses on developing robust, versatile tactile sensors, often vision-based (camera-based) designs that capture richer contact data, and on processing that data and learning control policies with machine-learning techniques such as convolutional neural networks, transformers, and Bayesian estimation. The field is crucial for advancing robotics, particularly dexterous manipulation, human-robot collaboration, and applications requiring precise interaction with delicate or unstructured environments.
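To make the Bayesian estimation mentioned above concrete, here is a minimal sketch of a discrete Bayes filter that fuses noisy tactile readings to localize a contact point on a small taxel array. All values and the Gaussian sensor model are illustrative assumptions, not taken from any of the papers below.

```python
import numpy as np

def bayes_update(prior, likelihood):
    """One Bayesian update: posterior is proportional to prior * likelihood."""
    posterior = prior * likelihood
    return posterior / posterior.sum()

def sensor_likelihood(reading, positions, sigma=1.0):
    """Illustrative sensor model: a reading is most likely when contact
    is at the taxel nearest the reading (Gaussian-shaped likelihood)."""
    return np.exp(-0.5 * ((positions - reading) / sigma) ** 2)

# Hypothetical setup: contact may occur at one of 5 taxel positions.
positions = np.arange(5)
belief = np.full(5, 0.2)  # uniform prior over contact location

# Fuse two noisy readings that both suggest contact near taxel 3.
for reading in [2.8, 3.1]:
    belief = bayes_update(belief, sensor_likelihood(reading, positions))

print(belief.argmax())  # most probable contact location -> 3
```

The same recursive update pattern scales to richer state spaces (e.g. contact pose or shear) by swapping in a different sensor model; it is one simple way touch measurements can be accumulated over time.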
Papers
Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control
Yijiong Lin, Mauro Comi, Alex Church, Dandan Zhang, Nathan F. Lepora
Sim-to-Real Model-Based and Model-Free Deep Reinforcement Learning for Tactile Pushing
Max Yang, Yijiong Lin, Alex Church, John Lloyd, Dandan Zhang, David A. W. Barton, Nathan F. Lepora