Human-Robot Interaction
Human-robot interaction (HRI) research focuses on designing robots that interact with humans effectively and naturally, with the goal of improving collaboration, communication, and overall user experience. Current work emphasizes robots that can understand and respond to diverse human behaviors, including speech, gestures, and even physiological signals, often using machine learning models such as vision transformers, convolutional neural networks, and reinforcement learning. These advances matter because they pave the way for safer, more intuitive, and more productive human-robot collaboration across domains ranging from industrial settings to assistive technologies and service robotics.
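As a concrete illustration of the kind of perception model this line of work relies on, the sketch below shows a small convolutional gesture classifier in PyTorch. The architecture, the 64x64 RGB input crops, and the eight-class output are illustrative assumptions for this sketch, not details drawn from any of the papers listed below.

```python
# Minimal sketch of a CNN-based gesture classifier for HRI perception.
# All sizes and the class count are illustrative assumptions.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    """Classifies a cropped gesture image into one of `num_classes` gestures."""
    def __init__(self, num_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)        # (batch, 32, 16, 16)
        x = torch.flatten(x, 1)     # (batch, 32*16*16)
        return self.classifier(x)   # unnormalized class scores

if __name__ == "__main__":
    model = GestureCNN(num_classes=8)
    frames = torch.randn(4, 3, 64, 64)   # batch of 4 RGB gesture crops
    logits = model(frames)
    print(logits.shape)                  # torch.Size([4, 8])
    print(logits.argmax(dim=1))          # predicted gesture index per crop
```

In a full HRI pipeline, a classifier like this would typically sit behind a hand or body detector and feed its predictions into the robot's dialogue or task planner; a transformer- or reinforcement-learning-based component could replace or complement it depending on the modality.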
Papers
Evaluating Gesture Recognition in Virtual Reality
Sandeep Reddy Sabbella, Sara Kaszuba, Francesco Leotta, Pascal Serrarens, Daniele Nardi
Testing Human-Robot Interaction in Virtual Reality: Experience from a Study on Speech Act Classification
Sara Kaszuba, Sandeep Reddy Sabbella, Francesco Leotta, Pascal Serrarens, Daniele Nardi
Learning Multimodal Latent Dynamics for Human-Robot Interaction
Vignesh Prasad, Lea Heitlinger, Dorothea Koert, Ruth Stock-Homburg, Jan Peters, Georgia Chalvatzaki
Evaluating the Impact of Personalized Value Alignment in Human-Robot Interaction: Insights into Trust and Team Performance Outcomes
Shreyas Bhat, Joseph B. Lyons, Cong Shi, X. Jessie Yang