Intuitive Human-Robot Interaction
Intuitive human-robot interaction (HRI) research aims to create robots that seamlessly understand and respond to human commands and cues, moving beyond complex programming toward natural communication. Current efforts integrate large language models (LLMs) and natural language processing (NLP) for verbal instruction, apply computer vision and tactile sensing to gesture recognition (including 3D pose estimation), and use machine learning techniques such as imitation learning to improve robot navigation and task execution from human demonstrations. This work has significant potential across robotics, from industrial automation and collaborative robotics to assistive technologies and human-robot teamwork in challenging environments.
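As a rough illustration of the verbal-instruction pipeline described above, the sketch below maps a natural-language command to a discrete robot action. It is a minimal, self-contained example, not the method of either paper listed here: the `RobotAction` type, the `ACTION_KEYWORDS` vocabulary, and `interpret_command` are hypothetical names, and a simple keyword lookup stands in for the LLM/NLP layer so the example runs offline.

```python
# Illustrative sketch only: turn a natural-language command into a discrete
# robot action. A real system would replace interpret_command with a call to
# an LLM or NLP model; a keyword lookup stands in here so the example runs
# without any external services.
from dataclasses import dataclass


@dataclass
class RobotAction:
    name: str        # e.g. "move_forward", "turn_left", "stop"
    argument: float  # distance in meters or angle in degrees; 0 if unused


# Hypothetical action vocabulary the robot controller is assumed to accept.
ACTION_KEYWORDS = {
    "forward": ("move_forward", 1.0),
    "back": ("move_backward", 1.0),
    "left": ("turn_left", 90.0),
    "right": ("turn_right", 90.0),
    "stop": ("stop", 0.0),
}


def interpret_command(utterance: str) -> RobotAction:
    """Stand-in for an LLM/NLP layer: pick the first matching keyword."""
    lowered = utterance.lower()
    for keyword, (name, arg) in ACTION_KEYWORDS.items():
        if keyword in lowered:
            return RobotAction(name, arg)
    # Unrecognized commands default to a safe action.
    return RobotAction("stop", 0.0)


if __name__ == "__main__":
    for cmd in ["Please move forward a bit", "Turn left at the door", "Halt"]:
        action = interpret_command(cmd)
        print(f"{cmd!r} -> {action.name}({action.argument})")
```

In a deployed system, the parsed action would be validated against the robot's capabilities and safety limits before being sent to the controller; the default-to-stop fallback above reflects that safety-first design choice.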
Papers
Deployment of Large Language Models to Control Mobile Robots at the Edge
Pascal Sikorski, Leendert Schrader, Kaleb Yu, Lucy Billadeau, Jinka Meenakshi, Naveena Mutharasan, Flavio Esposito, Hadi AliAkbarpour, Madi Babaiasl
Advancements in Tactile Hand Gesture Recognition for Enhanced Human-Machine Interaction
Chiara Fumelli, Anirvan Dutta, Mohsen Kaboli