Bird's Eye View Perception
Bird's-eye-view (BEV) perception aims to create a top-down, 2D representation of a 3D scene from camera and/or LiDAR data, primarily for autonomous driving applications. Current research focuses on improving the robustness and accuracy of BEV generation, often employing deep learning models such as transformers and leveraging techniques like multi-sensor fusion and self-supervised learning to address challenges such as data scarcity and sensor inconsistencies. The field is crucial for advancing autonomous driving: a unified, interpretable representation of the environment enables more reliable perception and decision-making.
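As a concrete illustration of the LiDAR side of such a pipeline, the sketch below rasterizes a point cloud into a two-channel BEV grid (occupancy and maximum height) that a downstream 2D detection head could consume. The function name, grid extents, and 0.5 m resolution are illustrative assumptions, not taken from any of the papers listed here.

```python
import numpy as np

def lidar_to_bev(points, x_range=(-50.0, 50.0), y_range=(-50.0, 50.0), resolution=0.5):
    """Rasterize an (N, 3) LiDAR point cloud (ego frame, meters) into a
    top-down BEV grid with per-cell occupancy and maximum height channels.

    The ranges and 0.5 m resolution are illustrative, not from any specific paper.
    """
    nx = int((x_range[1] - x_range[0]) / resolution)
    ny = int((y_range[1] - y_range[0]) / resolution)

    # Keep only points inside the BEV extent.
    mask = (
        (points[:, 0] >= x_range[0]) & (points[:, 0] < x_range[1]) &
        (points[:, 1] >= y_range[0]) & (points[:, 1] < y_range[1])
    )
    pts = points[mask]

    # Map metric coordinates to integer grid indices.
    ix = ((pts[:, 0] - x_range[0]) / resolution).astype(int)
    iy = ((pts[:, 1] - y_range[0]) / resolution).astype(int)

    occupancy = np.zeros((nx, ny), dtype=np.float32)
    max_height = np.full((nx, ny), -np.inf, dtype=np.float32)

    # Mark occupied cells and record the tallest point per cell.
    occupancy[ix, iy] = 1.0
    np.maximum.at(max_height, (ix, iy), pts[:, 2])
    max_height[max_height == -np.inf] = 0.0

    # Stack channels: a (2, H, W) BEV tensor ready for a 2D backbone.
    return np.stack([occupancy, max_height])

if __name__ == "__main__":
    # Synthetic cloud: 10k random points within +/- 60 m, heights in [-2, 2] m.
    rng = np.random.default_rng(0)
    cloud = np.column_stack([
        rng.uniform(-60, 60, 10_000),
        rng.uniform(-60, 60, 10_000),
        rng.uniform(-2, 2, 10_000),
    ])
    bev = lidar_to_bev(cloud)
    print(bev.shape)  # (2, 200, 200)
```

Camera-based methods instead lift image features into the same kind of grid (e.g., via learned depth distributions or transformer cross-attention), which is what makes the BEV plane a natural space for fusing the two sensor streams.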
Papers
Thirteen papers are indexed under this topic, published between August 19, 2022 and November 16, 2024.