Bird's-Eye-View Perception
Bird's-eye-view (BEV) perception, the construction of top-down representations of the driving environment from sensor inputs (primarily cameras, but also lidar and radar), is a crucial area of research in autonomous driving. Current efforts focus on robust and efficient BEV perception models, typically deep learning architectures that fuse multi-view image features through forward or backward projection and increasingly incorporate generative models to improve map layout estimation and handle uncertainty. These advances aim to improve the accuracy, speed, and generalizability of autonomous vehicle perception, ultimately leading to safer and more reliable self-driving systems.
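To make the backward-projection idea concrete, the sketch below samples per-camera image features onto a ground-plane BEV grid using the camera's intrinsic and extrinsic matrices. It is a minimal illustration under stated assumptions, not the method of any particular paper; the function name, tensor shapes, and parameters (such as `bev_range` and `z_ground`) are illustrative.

```python
# Minimal sketch of backward projection: each BEV cell on the ground plane is
# projected into the image with K and the ego-to-camera transform, and image
# features are bilinearly sampled at that pixel. Shapes/names are illustrative.
import torch
import torch.nn.functional as F

def backward_project_to_bev(feats, intrinsics, extrinsics,
                            bev_size=(200, 200), bev_range=50.0, z_ground=0.0):
    """feats: (C, H, W) feature map from one camera.
    intrinsics: (3, 3) camera matrix K.
    extrinsics: (4, 4) transform from ego frame to camera frame.
    Returns a (C, bev_h, bev_w) BEV feature map (zeros where unobserved)."""
    C, H, W = feats.shape
    bev_h, bev_w = bev_size

    # Metric (x, y) coordinates of each BEV cell centre on the ground plane.
    xs = torch.linspace(-bev_range, bev_range, bev_w)
    ys = torch.linspace(-bev_range, bev_range, bev_h)
    gy, gx = torch.meshgrid(ys, xs, indexing="ij")
    pts = torch.stack([gx, gy, torch.full_like(gx, z_ground),
                       torch.ones_like(gx)], dim=-1)           # (bev_h, bev_w, 4)

    # Transform ground points into the camera frame and project with K.
    cam = pts.reshape(-1, 4) @ extrinsics.T                    # (N, 4)
    depth = cam[:, 2:3].clamp(min=1e-5)
    pix = (cam[:, :3] / depth) @ intrinsics.T                  # (N, 3) -> (u, v, 1)

    # Normalise pixel coords to [-1, 1] for grid_sample; mask points that fall
    # behind the camera or outside the image.
    u = pix[:, 0] / (W - 1) * 2 - 1
    v = pix[:, 1] / (H - 1) * 2 - 1
    valid = (cam[:, 2] > 0) & (u.abs() <= 1) & (v.abs() <= 1)

    grid = torch.stack([u, v], dim=-1).reshape(1, bev_h, bev_w, 2)
    bev = F.grid_sample(feats[None], grid, align_corners=True)  # (1, C, bev_h, bev_w)
    return bev[0] * valid.float().reshape(1, bev_h, bev_w)
```

In a multi-camera setup this would be called once per view and the per-camera BEV maps combined (for example by summation or max-pooling over valid cells); forward-projection approaches instead lift image features into 3D using predicted depth and splat them onto the grid.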