Tactile Modality
Tactile modality research focuses on equipping robots with a sense of touch, aiming to improve dexterity, manipulation skill, and human-robot interaction. Current efforts concentrate on developing advanced tactile sensors, often integrated with vision or other modalities, and on processing tactile data with learning and estimation techniques such as deep neural networks, LSTM networks, and Kalman filters. This research is crucial for advancing robotics in areas such as surgery, manufacturing, and assistive technologies, enabling robots to perform complex tasks that require fine motor control and interaction with diverse materials and environments. Developing more realistic and robust tactile simulations is also a key focus, bridging the gap between simulated and real-world performance.
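As a concrete illustration of the estimation techniques mentioned above, the sketch below applies a minimal one-dimensional Kalman filter to a noisy tactile reading. This is a generic, hypothetical example, not drawn from any of the listed papers; the function name, signal values, and noise parameters are all illustrative assumptions.

```python
# Hypothetical sketch: a 1-D Kalman filter smoothing a noisy scalar
# tactile signal (e.g. contact pressure). Parameter values are illustrative.

def kalman_smooth(measurements, process_var=1e-4, measurement_var=0.05):
    """Return a filtered copy of a sequence of scalar tactile readings."""
    estimate, error = measurements[0], 1.0      # initial state and uncertainty
    smoothed = [estimate]
    for z in measurements[1:]:
        error += process_var                    # predict: uncertainty grows
        gain = error / (error + measurement_var)  # Kalman gain
        estimate += gain * (z - estimate)       # update with the new reading
        error *= (1.0 - gain)                   # uncertainty shrinks after update
        smoothed.append(estimate)
    return smoothed

# A constant true pressure of 1.0 corrupted by sensor noise:
readings = [1.0, 1.2, 0.8, 1.1, 0.9, 1.05, 0.95, 1.0]
filtered = kalman_smooth(readings)
```

The same predict-update structure extends to multi-taxel sensors by replacing the scalars with state vectors and covariance matrices.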
Papers
NUSense: Robust Soft Optical Tactile Sensor
Madina Yergibay, Tleukhan Mussin, Saltanat Seitzhan, Daryn Kenzhebek, Zhanat Kappassov, Harold Soh, Tasbolat Taunyazov
MiniTac: An Ultra-Compact 8 mm Vision-Based Tactile Sensor for Enhanced Palpation in Robot-Assisted Minimally Invasive Surgery
Wanlin Li, Zihang Zhao, Leiyao Cui, Weiyi Zhang, Hangxin Liu, Li-An Li, Yixin Zhu