Interactive Perception
Interactive perception enables robots to actively gather sensory information by interacting with their environment, improving both object understanding and manipulation. Current research emphasizes methods for handling articulated and deformable objects, often using closed-loop control and fusing multiple sensor modalities (vision, touch, sound) within unified frameworks, sometimes with neural networks trained by self-supervised learning. This work matters for robotics applications such as autonomous driving, surgical robotics, and domestic tasks, where robots must act robustly and adaptively in complex, unstructured environments.
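To make the closed-loop idea concrete, here is a minimal, hypothetical sketch in Python: a robot repeatedly probes an object, fuses noisy vision and touch readings of a fixed latent property (a joint angle of an articulated object) with a scalar Kalman update, and stops interacting once its belief is confident. The probe and fuse functions, noise levels, and stopping threshold are illustrative assumptions, not taken from any of the surveyed papers.

```python
import numpy as np

# Minimal sketch of closed-loop interactive perception (illustrative only):
# probe the object, fuse multimodal observations, stop when confident.

def probe(true_angle, rng):
    """Simulate one interaction: push the object and read both modalities."""
    vision = true_angle + rng.normal(0.0, 0.05)  # camera-based angle reading
    touch = true_angle + rng.normal(0.0, 0.02)   # force/torque-based reading
    return vision, touch

def fuse(mean, var, obs, obs_var):
    """Scalar Kalman update: blend one observation into the current belief."""
    gain = var / (var + obs_var)
    return mean + gain * (obs - mean), (1.0 - gain) * var

rng = np.random.default_rng(0)
true_angle = 0.8           # latent joint angle (rad), unknown to the robot
mean, var = 0.0, 1.0       # prior belief: uninformed

for step in range(50):
    vision, touch = probe(true_angle, rng)
    mean, var = fuse(mean, var, vision, 0.05 ** 2)
    mean, var = fuse(mean, var, touch, 0.02 ** 2)
    if var < 1e-5:         # belief has converged: stop interacting
        break

print(f"interactions={step + 1}, estimate={mean:.3f}, true={true_angle:.3f}")
```

The design point this illustrates is that interaction is chosen in the loop: sensing, action, and belief updates alternate until the uncertainty criterion is met, rather than perception running once on a passive observation.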