Human-Robot Interaction
Human-robot interaction (HRI) research focuses on designing robots that interact with humans effectively and naturally, with the goals of improving collaboration, communication, and overall user experience. Current work emphasizes robots that can understand and respond to diverse human behaviors, including speech, gestures, and physiological signals, often using machine learning methods such as vision transformers, convolutional neural networks, and reinforcement learning. These advances matter because they enable safer, more intuitive, and more productive human-robot collaboration across domains ranging from industrial settings to assistive technologies and service robotics.
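The overview above mentions vision transformers and CNNs for interpreting human behavior such as gestures. As a rough illustration only, and not drawn from any of the papers listed below, the sketch assumes torchvision's pretrained vit_b_16 as the perception backbone and shows how a single camera frame could be classified inside a perception loop; a deployed system would fine-tune on gesture or activity data and map predicted labels to robot commands.

```python
# Minimal sketch, assuming a pretrained vision transformer from torchvision.
# The model choice and ImageNet label set are illustrative assumptions; a real
# HRI pipeline would fine-tune on gesture data and fuse additional modalities.
import torch
from torchvision.models import vit_b_16, ViT_B_16_Weights

weights = ViT_B_16_Weights.DEFAULT          # pretrained ImageNet weights
model = vit_b_16(weights=weights).eval()    # vision transformer backbone
preprocess = weights.transforms()           # matching resize/normalize pipeline

@torch.no_grad()
def classify_frame(frame):
    """Return the top-1 label and its probability for one RGB frame (PIL image)."""
    batch = preprocess(frame).unsqueeze(0)   # shape (1, 3, 224, 224)
    probs = model(batch).softmax(dim=-1)
    top1 = probs.argmax(dim=-1).item()
    return weights.meta["categories"][top1], probs[0, top1].item()
```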
Papers
Testing Human-Hand Segmentation on In-Distribution and Out-of-Distribution Data in Human-Robot Interactions Using a Deep Ensemble Model
Reza Jalayer, Yuxin Chen, Masoud Jalayer, Carlotta Orsenigo, Masayoshi Tomizuka
Inductive Learning of Robot Task Knowledge from Raw Data and Online Expert Feedback
Daniele Meli, Paolo Fiorini
GestLLM: Advanced Hand Gesture Interpretation via Large Language Models for Human-Robot Interaction
Oleg Kobzarev, Artem Lykov, Dzmitry Tsetserukou
Multi-face emotion detection for effective Human-Robot Interaction
Mohamed Ala Yahyaoui, Mouaad Oujabour, Leila Ben Letaifa, Amine Bohi
ROSAnnotator: A Web Application for ROSBag Data Analysis in Human-Robot Interaction
Yan Zhang, Haoqi Li, Ramtin Tabatabaei, Wafa Johal
THÖR-MAGNI Act: Actions for Human Motion Modeling in Robot-Shared Industrial Spaces
Tiago Rodrigues de Almeida, Tim Schreiter, Andrey Rudenko, Luigi Palmieri, Johannes A. Stork, Achim J. Lilienthal
Unified Understanding of Environment, Task, and Human for Human-Robot Interaction in Real-World Environments
Yuga Yano, Akinobu Mizutani, Yukiya Fukuda, Daiju Kanaoka, Tomohiro Ono, Hakaru Tamukoh