Tactile Sensing
Tactile sensing aims to equip robots with a sense of touch, enabling complex manipulation tasks that require fine motor skills and interaction with diverse objects. Current research focuses on developing robust, versatile tactile sensors (often vision-based, for richer data) and on processing their signals and learning policies with machine-learning techniques such as convolutional neural networks, transformers, and Bayesian estimation. The field is crucial for advancing robotics, particularly dexterous manipulation, human-robot collaboration, and applications that demand precise interaction with delicate objects or unstructured environments.
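To make the Bayesian-estimation angle concrete, here is a minimal sketch of grid-based Bayesian filtering applied to a toy tactile problem: estimating a 1D contact location from a few noisy readings. All names, the grid, and the Gaussian noise model are illustrative assumptions, not drawn from any of the papers listed below.

```python
import numpy as np

def bayes_update(prior, measurement, positions, sigma=0.5):
    """One Bayesian update: weight the prior by a Gaussian likelihood
    of the measurement at each candidate contact position, then renormalize."""
    likelihood = np.exp(-0.5 * ((positions - measurement) / sigma) ** 2)
    posterior = prior * likelihood
    return posterior / posterior.sum()

# Candidate contact locations along a sensor strip (cm) -- assumed setup.
positions = np.linspace(0.0, 10.0, 101)
belief = np.full_like(positions, 1.0 / positions.size)  # uniform prior

rng = np.random.default_rng(0)
true_contact = 6.2  # ground-truth location used only to simulate readings
for _ in range(5):  # fuse five noisy simulated taxel readings
    reading = true_contact + rng.normal(0.0, 0.5)
    belief = bayes_update(belief, reading, positions)

estimate = positions[np.argmax(belief)]
```

After a handful of updates the posterior concentrates near the true contact point, which is the basic mechanism such estimators exploit when fusing repeated noisy tactile measurements.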
Papers
Braille Letter Reading: A Benchmark for Spatio-Temporal Pattern Recognition on Neuromorphic Hardware
Simon F. Müller-Cleve, Vittorio Fra, Lyes Khacef, Alejandro Pequeño-Zurro, Daniel Klepatsch, Evelina Forno, Diego G. Ivanovich, Shavika Rastogi, Gianvito Urgese, Friedemann Zenke, Chiara Bartolozzi
Play it by Ear: Learning Skills amidst Occlusion through Audio-Visual Imitation Learning
Maximilian Du, Olivia Y. Lee, Suraj Nair, Chelsea Finn