Physical Interaction
Physical interaction research aims to enable robots and AI systems to interact safely and effectively with their environments and with humans, spanning tasks from object manipulation to collaborative work. Current work emphasizes robust control methods, often combining machine learning models (e.g., graph neural networks, large language models) with advanced sensors (e.g., tactile skins) to improve the perception, planning, and execution of physical interactions. The field is central to advancing robotics, human-robot collaboration, and AI safety, with applications ranging from assistive robotics and industrial automation to virtual and augmented reality.
Papers
Pose Priors from Language Models
Sanjay Subramanian, Evonne Ng, Lea Müller, Dan Klein, Shiry Ginosar, Trevor Darrell
CushSense: Soft, Stretchable, and Comfortable Tactile-Sensing Skin for Physical Human-Robot Interaction
Boxin Xu, Luoyan Zhong, Grace Zhang, Xiaoyu Liang, Diego Virtue, Rishabh Madan, Tapomayukh Bhattacharjee