Bird's-Eye View
Bird's-Eye View (BEV) perception aims to build a top-down representation of a scene from multiple camera images, similar to an overhead aerial view; such representations are central to autonomous driving and robotics. Current research focuses on improving the accuracy and robustness of BEV generation with deep learning architectures, notably transformers and attention mechanisms, often combined with sensor fusion (e.g., lidar and camera) and designed to handle challenges such as occlusion and varying camera viewpoints. This work matters because accurate and reliable BEV representations are essential for safe and efficient navigation in autonomous systems, shaping the development of self-driving cars and other robotic applications.
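To make the attention-based view transformation concrete, below is a minimal sketch of one common pattern: a grid of learned BEV queries cross-attends to flattened multi-camera image features to produce a top-down feature map. All names, shapes, and hyperparameters (CameraToBEV, bev_h, bev_w, embed_dim) are illustrative assumptions, not taken from any specific paper listed here; real systems add camera geometry, positional encodings, temporal fusion, and task heads on top.

```python
# Hypothetical sketch: lift multi-camera image features into a BEV grid
# with cross-attention. Shapes and module names are illustrative only.
import torch
import torch.nn as nn

class CameraToBEV(nn.Module):
    def __init__(self, bev_h=50, bev_w=50, embed_dim=128, num_heads=4):
        super().__init__()
        # One learnable query per BEV grid cell.
        self.bev_queries = nn.Parameter(torch.randn(bev_h * bev_w, embed_dim))
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        self.bev_h, self.bev_w = bev_h, bev_w

    def forward(self, cam_feats):
        # cam_feats: (B, num_cams, C, H, W) features from a shared image backbone.
        B, N, C, H, W = cam_feats.shape
        # Flatten every camera's spatial locations into one key/value sequence.
        kv = cam_feats.permute(0, 1, 3, 4, 2).reshape(B, N * H * W, C)
        q = self.bev_queries.unsqueeze(0).expand(B, -1, -1)
        # Each BEV cell aggregates evidence from all cameras via attention.
        bev, _ = self.cross_attn(q, kv, kv)
        bev = self.norm(bev + q)
        # Reshape the token sequence back into a (C, bev_h, bev_w) top-down map.
        return bev.transpose(1, 2).reshape(B, -1, self.bev_h, self.bev_w)

if __name__ == "__main__":
    feats = torch.randn(2, 6, 128, 16, 28)  # e.g., 6 surround-view cameras
    bev_map = CameraToBEV()(feats)
    print(bev_map.shape)                    # torch.Size([2, 128, 50, 50])
```

The resulting BEV feature map can then feed downstream heads such as map segmentation or 3D object detection; fusing lidar features would typically mean concatenating or attending over a second voxelized feature source in the same grid.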