Human-Robot Interaction
Human-robot interaction (HRI) research focuses on designing robots that interact with humans effectively and naturally, with the goal of improving collaboration, communication, and overall user experience. Current work emphasizes robots that can understand and respond to diverse human behaviors, including speech, gestures, and even physiological signals, often using machine learning models such as vision transformers, convolutional neural networks, and reinforcement learning algorithms. These advances matter because they pave the way for safer, more intuitive, and more productive human-robot collaboration across domains ranging from industrial settings to assistive technologies and service robotics.
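To make the pattern mentioned above concrete, here is a minimal, hypothetical sketch (not taken from any of the papers listed below) of one common setup: classifying short sequences of hand-keypoint frames into gesture classes with a small transformer encoder in PyTorch. All names, dimensions, and class counts are illustrative assumptions.

```python
# Hypothetical sketch: gesture classification from keypoint sequences
# with a small Transformer encoder (illustrative only, not from the listed papers).
import torch
import torch.nn as nn

class GestureClassifier(nn.Module):
    def __init__(self, n_keypoints=21, d_model=64, n_classes=5):
        super().__init__()
        # Each frame: n_keypoints * 2 (x, y) coordinates -> d_model embedding
        self.embed = nn.Linear(n_keypoints * 2, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, time, n_keypoints * 2)
        h = self.encoder(self.embed(x))
        return self.head(h.mean(dim=1))  # average-pool over time, then classify

model = GestureClassifier()
dummy = torch.randn(8, 30, 21 * 2)  # 8 clips, 30 frames, 21 keypoints each
logits = model(dummy)                # (8, 5) gesture-class scores
print(logits.shape)
```

In practice, the keypoints would come from a pose-estimation front end and the model would be trained on labeled gesture clips; the sketch only shows the shape of the sequence-to-label mapping that much of this literature builds on.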
Papers
SiSCo: Signal Synthesis for Effective Human-Robot Communication Via Large Language Models
Shubham Sonawani, Fabian Weigend, Heni Ben Amor
Personalization in Human-Robot Interaction through Preference-based Action Representation Learning
Ruiqi Wang, Dezhong Zhao, Dayoon Suh, Ziqin Yuan, Guohua Chen, Byung-Cheol Min
Personalized Speech Emotion Recognition in Human-Robot Interaction using Vision Transformers
Ruchik Mishra, Andrew Frye, Madan Mohan Rayguru, Dan O. Popa
Highly dynamic physical interaction for robotics: design and control of an active remote center of compliance
Christian Friedrich, Patrick Frank, Marco Santin, Matthias Haag
Advancements in Gesture Recognition Techniques and Machine Learning for Enhanced Human-Robot Interaction: A Comprehensive Review
Sajjad Hussain, Khizer Saeed, Almas Baimagambetov, Shanay Rab, Md Saad
Adaptive Electronic Skin Sensitivity for Safe Human-Robot Interaction
Lukas Rustler, Matej Misar, Matej Hoffmann
The Influence of Demographic Variation on the Perception of Industrial Robot Movements
Damian Hostettler
Using vs. Purchasing Industrial Robots: Adding an Organizational Perspective to Industrial HRI
Damian Hostettler
Gesture Generation from Trimodal Context for Humanoid Robots
Shiyi Tang, Christian Dondrup
Enhancing Socially-Aware Robot Navigation through Bidirectional Natural Language Conversation
Congcong Wen, Yifan Liu, Geeta Chandra Raju Bethala, Zheng Peng, Hui Lin, Yu-Shen Liu, Yi Fang
FlowAct: A Proactive Multimodal Human-robot Interaction System with Continuous Flow of Perception and Modular Action Sub-systems
Timothée Dhaussy, Bassam Jabaian, Fabrice Lefèvre
Benchmarking ML Approaches to UWB-Based Range-Only Posture Recognition for Human Robot-Interaction
Salma Salimi, Sahar Salimpour, Jorge Peña Queralta, Wallace Moreira Bessa, Tomi Westerlund