Intuitive Human-Robot Interaction

Intuitive human-robot interaction (HRI) research aims to create robots that understand and respond to human commands and cues naturally, replacing complex programming with more natural communication. Current efforts integrate large language models (LLMs) and natural language processing (NLP) for verbal instruction, apply computer vision to gesture recognition (including 3D pose estimation and tactile sensing), and use machine learning techniques such as imitation learning so that robots can improve navigation and task execution from human demonstrations. This work holds significant potential for robotics across fields ranging from industrial automation and collaborative robotics to assistive technologies and human-robot teamwork in challenging environments.
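As a toy illustration of the imitation-learning idea mentioned above, behavior cloning treats demonstrations as supervised data: it fits a policy that maps observed states to the actions a demonstrator took. The sketch below is a minimal, hypothetical setup (the "expert" is a synthetic proportional controller standing in for human demonstrations, and the linear policy and gain values are assumptions for illustration, not drawn from any cited paper):

```python
import numpy as np

# Toy behavior cloning: learn a linear policy a = s @ W from demonstrations.
# The "expert" here is a hypothetical proportional controller that steers
# toward the origin; in real HRI work the (state, action) pairs would come
# from human demonstrations instead.
rng = np.random.default_rng(0)

# Demonstration data: 2-D robot positions and the expert's actions.
states = rng.uniform(-1.0, 1.0, size=(200, 2))
expert_gain = np.array([[-0.5, 0.0],
                        [0.0, -0.5]])          # steer toward the origin
actions = states @ expert_gain.T               # expert actions for each state

# Behavior cloning = supervised learning: fit W by least squares so that
# states @ W approximates the demonstrated actions.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

# With clean, noiseless demonstrations the learned policy recovers the
# expert's gain matrix (W.T equals expert_gain up to numerical precision).
print(np.allclose(W.T, expert_gain, atol=1e-6))
```

In practice the linear model would be replaced by a neural network and the demonstrations would be noisy human trajectories, but the structure (collect state-action pairs, fit a policy by regression) is the same.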

Papers