Robot Person
Research in this area focuses on creating robots capable of interacting naturally and effectively with humans, encompassing tasks from simple navigation to complex manipulation and social interaction. Current work emphasizes developing robust control and estimation algorithms (such as Kalman filters and Model Predictive Control), integrating advanced perception models (including Vision-Language Models and sensor fusion), and improving human-robot interaction through multimodal communication and shared autonomy. The field advances robotics in sectors such as healthcare, manufacturing, and service industries by enabling robots to work more safely, efficiently, and intuitively alongside humans.
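As a concrete illustration of the estimation techniques mentioned above, here is a minimal one-dimensional Kalman filter sketch. It is a generic textbook example, not the method of any paper listed below; all names and parameter values are illustrative assumptions.

```python
def kalman_1d(measurements, process_var=1e-3, meas_var=0.25, x0=0.0, p0=1.0):
    """Estimate a scalar state from noisy measurements.

    x0, p0       : initial state estimate and its variance
    process_var  : variance of the random-walk process noise (Q)
    meas_var     : variance of the measurement noise (R)
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state is modeled as constant, so only uncertainty grows.
        p = p + process_var
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + meas_var)   # gain in [0, 1]
        x = x + k * (z - x)      # correct the estimate toward the measurement
        p = (1.0 - k) * p        # reduce the estimate's uncertainty
        estimates.append(x)
    return estimates

# Usage: smooth noisy range readings whose true value is about 1.0.
readings = [1.1, 0.9, 1.05, 0.95, 1.2, 0.8, 1.0]
smoothed = kalman_1d(readings)
```

With each measurement, the gain shrinks and the estimate settles near the true value; in a real robot, the same predict/update cycle would run over a multidimensional state (pose, velocity) with matrix-valued covariances.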
Papers
Yell At Your Robot: Improving On-the-Fly from Language Corrections
Lucy Xiaoyang Shi, Zheyuan Hu, Tony Z. Zhao, Archit Sharma, Karl Pertsch, Jianlan Luo, Sergey Levine, Chelsea Finn
BTGenBot: Behavior Tree Generation for Robotic Tasks with Lightweight LLMs
Riccardo Andrea Izzo, Gianluca Bardaro, Matteo Matteucci
To Help or Not to Help: LLM-based Attentive Support for Human-Robot Group Interactions
Daniel Tanneberg, Felix Ocker, Stephan Hasler, Joerg Deigmoeller, Anna Belardinelli, Chao Wang, Heiko Wersing, Bernhard Sendhoff, Michael Gienger
Aligning Learning with Communication in Shared Autonomy
Joshua Hoegerman, Shahabedin Sagheb, Benjamin A. Christie, Dylan P. Losey
Hardware Design and Learning-Based Software Architecture of Musculoskeletal Wheeled Robot Musashi-W for Real-World Applications
Kento Kawaharazuka, Akihiro Miki, Masahiro Bando, Temma Suzuki, Yoshimoto Ribayashi, Yasunori Toshimitsu, Yuya Nagamatsu, Kei Okada, Masayuki Inaba
Are you a robot? Detecting Autonomous Vehicles from Behavior Analysis
Fabio Maresca, Filippo Grazioli, Antonio Albanese, Vincenzo Sciancalepore, Gianpiero Negri, Xavier Costa-Perez
Enabling Waypoint Generation for Collaborative Robots using LLMs and Mixed Reality
Cathy Mengying Fang, Krzysztof Zieliński, Pattie Maes, Joe Paradiso, Bruce Blumberg, Mikkel Baun Kjærgaard
BEHAVIOR-1K: A Human-Centered, Embodied AI Benchmark with 1,000 Everyday Activities and Realistic Simulation
Chengshu Li, Ruohan Zhang, Josiah Wong, Cem Gokmen, Sanjana Srivastava, Roberto Martín-Martín, Chen Wang, Gabrael Levine, Wensi Ai, Benjamin Martinez, Hang Yin, Michael Lingelbach, Minjune Hwang, Ayano Hiranaka, Sujay Garlanka, Arman Aydin, Sharon Lee, Jiankai Sun, Mona Anvari, Manasi Sharma, Dhruva Bansal, Samuel Hunter, Kyu-Young Kim, Alan Lou, Caleb R Matthews, Ivan Villa-Renteria, Jerry Huayang Tang, Claire Tang, Fei Xia, Yunzhu Li, Silvio Savarese, Hyowon Gweon, C. Karen Liu, Jiajun Wu, Li Fei-Fei
Personalizing Interfaces to Humans with User-Friendly Priors
Benjamin A. Christie, Heramb Nemlekar, Dylan P. Losey
Data-driven architecture to encode information in the kinematics of robots and artificial avatars
Francesco De Lellis, Marco Coraggio, Nathan C. Foster, Riccardo Villa, Cristina Becchio, Mario di Bernardo
Hefty: A Modular Reconfigurable Robot for Advancing Robot Manipulation in Agriculture
Dominic Guri, Moonyoung Lee, Oliver Kroemer, George Kantor
Robot Body Schema Learning from Full-body Extero/Proprioception Sensors
Shuo Jiang, Jinkun Zhang, Lawson Wong
Equivalent Environments and Covering Spaces for Robots
Vadim K. Weinstein, Steven M. LaValle