Perception Module
Perception modules are crucial components of autonomous systems: they interpret sensor data (visual, audio, depth, etc.) and supply reliable information for decision-making. Current research emphasizes improving the robustness and generalization of these modules, often employing deep learning architectures such as transformers and convolutional neural networks, and exploring techniques such as self-supervised learning and multi-modal fusion to enhance performance. This work is vital for advancing autonomous driving, robotics, and other applications requiring reliable real-time environmental understanding, particularly in addressing challenges such as perception errors and sim-to-real transfer.
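To make the idea of multi-modal fusion concrete, here is a minimal sketch of decision-level (late) fusion, one common fusion strategy: each sensor modality produces its own per-class confidence scores, and the module combines them with fixed weights. The function name, class labels, and weights below are illustrative assumptions, not taken from any specific paper.

```python
# Late (decision-level) multi-modal fusion sketch.
# Assumes each modality already outputs per-class confidence scores;
# fuse_scores, the weights, and the class labels are hypothetical.

def fuse_scores(camera_scores, lidar_scores, w_camera=0.6, w_lidar=0.4):
    """Weighted average of per-class scores from two sensor modalities."""
    assert len(camera_scores) == len(lidar_scores)
    return [w_camera * c + w_lidar * l
            for c, l in zip(camera_scores, lidar_scores)]

# Example with three classes (car, pedestrian, cyclist):
camera = [0.7, 0.2, 0.1]   # scores from a vision-based classifier
lidar = [0.5, 0.4, 0.1]    # scores from a depth-based classifier
fused = fuse_scores(camera, lidar)
```

In practice, research systems often fuse earlier (feature-level fusion inside the network) or learn the fusion weights end-to-end, but the weighted-combination principle shown here is the same.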