Human-Robot Interaction
Human-robot interaction (HRI) research focuses on designing robots that can interact with humans effectively and naturally, with the goal of improving collaboration, communication, and overall user experience. Current work emphasizes robots that can understand and respond to diverse human behaviors, including speech, gestures, and physiological signals, often using machine learning models such as vision transformers, convolutional neural networks, and reinforcement learning. These advances matter because they enable safer, more intuitive, and more productive human-robot collaboration across domains ranging from industrial settings to assistive technologies and service robotics.
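To make the gesture-understanding channel mentioned above concrete, the sketch below is a minimal, illustrative PyTorch example of a keypoint-based gesture classifier of the kind an HRI perception stack might feed into a planner. It is not taken from any of the papers listed here; the gesture label set, joint count, window length, and network shape are all assumptions made for illustration.

# Minimal sketch (illustrative only): a toy classifier that maps a short window of
# 2D body keypoints to a small set of command gestures. All names, shapes, and
# gesture labels are assumptions, not drawn from the listed papers.
import torch
import torch.nn as nn

GESTURES = ["stop", "come_here", "point_left", "point_right", "none"]  # assumed label set

class GestureClassifier(nn.Module):
    def __init__(self, num_joints: int = 17, window: int = 30, num_classes: int = len(GESTURES)):
        super().__init__()
        # Flatten a (window, num_joints, 2) keypoint sequence and classify it with a small MLP.
        in_dim = window * num_joints * 2
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, keypoints: torch.Tensor) -> torch.Tensor:
        # keypoints: (batch, window, num_joints, 2) normalized image coordinates
        return self.net(keypoints)

if __name__ == "__main__":
    model = GestureClassifier()
    dummy = torch.randn(1, 30, 17, 2)      # one 30-frame keypoint window
    probs = model(dummy).softmax(dim=-1)
    print(GESTURES[int(probs.argmax())])   # predicted gesture label

In a real system this classifier would sit downstream of a pose estimator and upstream of the robot's decision or planning layer; the papers below study richer versions of that pipeline, including haptic, visual, and augmented-reality interaction channels.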
Papers
Whither the Priors for (Vocal) Interactivity?
Roger K. Moore
Towards Formalizing HRI Data Collection Processes
Zhao Han, Tom Williams
Design and Evaluation of an Augmented Reality Head-Mounted Display Interface for Human Robot Teams Collaborating in Physically Shared Manufacturing Tasks
Wesley P. Chan, Geoffrey Hanks, Maram Sakr, Haomiao Zhang, Tiger Zuo, H.F. Machiel Van der Loos, Elizabeth Croft
The role of haptic communication in dyadic collaborative object manipulation tasks
Yiming Liu, Raz Leib, William Dudley, Ali Shafti, A. Aldo Faisal, David W. Franklin
Container Localisation and Mass Estimation with an RGB-D Camera
Tommaso Apicella, Giulia Slavic, Edoardo Ragusa, Paolo Gastaldo, Lucio Marcenaro
Commonsense Reasoning for Identifying and Understanding the Implicit Need of Help and Synthesizing Assistive Actions
Maëlic Neau, Paulo Santos, Anne-Gwenn Bosser, Nathan Beu, Cédric Buche
Trajectory planning in Dynamics Environment: Application for Haptic Perception in Safe Human-Robot Interaction
A. Gutierrez, V. Guda, S. Mugisha, C. Chevallereau, Damien Chablat
Virtual, Augmented, and Mixed Reality for Human-Robot Interaction: A Survey and Virtual Design Element Taxonomy
Michael Walker, Thao Phung, Tathagata Chakraborti, Tom Williams, Daniel Szafir