Sensor Fusion
Sensor fusion integrates data from multiple sensors to enhance the accuracy, robustness, and reliability of perception systems. Current research emphasizes efficient and robust fusion algorithms that handle diverse sensor modalities (e.g., camera, LiDAR, radar, inertial sensors) and address challenges such as sensor misalignment and label uncertainty. These algorithms often employ deep learning architectures like convolutional neural networks (CNNs) and transformers, alongside Kalman filters and other probabilistic methods. This field is crucial for advancing autonomous vehicles, robotics, and other applications requiring accurate and reliable real-time environmental understanding.
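To make the probabilistic side of this concrete, below is a minimal sketch of Kalman-filter-based fusion. It is an illustrative assumption rather than any specific paper's method: a constant-velocity target is observed by two hypothetical position sensors with different noise levels (a "LiDAR-like" and a "radar-like" sensor), and both measurements are fused sequentially in each update step.

```python
import numpy as np

def predict(x, P, F, Q):
    """Propagate state mean x and covariance P through the motion model F."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Fuse measurement z (with noise covariance R) into the state estimate."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
Q = 0.01 * np.eye(2)                    # process noise
H = np.array([[1.0, 0.0]])              # both sensors observe position only
R_lidar = np.array([[0.05]])            # assumed low-noise sensor
R_radar = np.array([[0.5]])             # assumed high-noise sensor

x, P = np.array([0.0, 1.0]), np.eye(2)  # initial state [position, velocity]
true_pos = 0.0
for _ in range(50):
    true_pos += 1.0 * dt                # ground truth moves at 1 m/s
    x, P = predict(x, P, F, Q)
    # Simulate one noisy measurement per sensor and fuse them sequentially.
    z_lidar = np.array([true_pos + rng.normal(0, np.sqrt(R_lidar[0, 0]))])
    z_radar = np.array([true_pos + rng.normal(0, np.sqrt(R_radar[0, 0]))])
    x, P = update(x, P, z_lidar, H, R_lidar)
    x, P = update(x, P, z_radar, H, R_radar)

print(f"fused position estimate: {x[0]:.3f} (truth {true_pos:.3f})")
```

Because each update weights a measurement by its noise covariance, the low-noise sensor dominates the fused estimate while the noisier sensor still contributes; deep-learning fusion architectures pursue the same goal, but learn these weightings from data instead of specifying them analytically.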