Liquid Perception
Liquid perception in robotics focuses on enabling robots to understand and interact with liquids, a capability crucial for tasks such as pouring and autonomous navigation on water. Current research emphasizes robust liquid detection and property estimation across diverse sensor modalities, including vision (often using deep learning models such as large vision-language models and convolutional neural networks), tactile sensing, and radar, frequently combined through data fusion. These advances are driven by the need for greater accuracy and efficiency in robotic manipulation and autonomous systems, with applications ranging from industrial automation to assistive technologies.
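One common way to combine detections from multiple modalities is decision-level (late) fusion, where each sensor independently reports a probability that liquid is present and the probabilities are merged. The sketch below illustrates a weighted log-odds fusion under an independent-sensor assumption; the modality names, weights, and function are hypothetical, not from any specific system described above.

```python
import math

def fuse_liquid_detections(probs, weights=None):
    """Combine per-modality liquid probabilities via weighted log-odds.

    probs   -- dict mapping modality name to P(liquid) in (0, 1)
    weights -- optional dict of per-modality reliability weights
    Returns the fused probability that liquid is present.
    """
    weights = weights or {m: 1.0 for m in probs}
    # Sum weighted log-odds; assumes modalities err independently.
    logit = sum(weights[m] * math.log(p / (1.0 - p)) for m, p in probs.items())
    return 1.0 / (1.0 + math.exp(-logit))

# Illustrative readings: each modality is moderately confident liquid is present,
# so the fused estimate exceeds any single modality's confidence.
readings = {"vision": 0.8, "tactile": 0.6, "radar": 0.7}
fused = fuse_liquid_detections(readings)
```

In practice, per-modality weights would be calibrated from validation data (e.g. down-weighting radar in cluttered scenes), and many systems fuse at the feature level inside a learned model instead; this decision-level scheme is shown only because it is simple and modality-agnostic.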