Neuromorphic Event-Based Vision
Neuromorphic event-based vision uses asynchronous sensors that mimic biological retinas: each pixel independently reports changes in light intensity rather than full frames, yielding advantages in speed, power efficiency, and dynamic range over traditional frame-based cameras. Current research focuses on algorithms and models, including spiking neural networks (SNNs) and transformers, that process these event streams for tasks such as object detection, tracking, and 3D reconstruction, often fusing event data with complementary RGB or LiDAR information. The field is significant for its potential to enable low-power, high-performance vision systems in applications ranging from autonomous driving and robotics to human-computer interaction and biomedical sensing.
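To make the sensing model concrete, the sketch below simulates the idealized behavior of a dynamic vision sensor: a pixel emits an event when its log intensity changes by more than a contrast threshold, with polarity indicating the direction of change. This is a minimal illustration of the general principle, not any specific sensor's implementation; the function name and threshold value are assumptions chosen for the example.

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.2):
    """Emit DVS-style events where log intensity changes beyond a threshold.

    Returns an (N, 3) array of (row, col, polarity) triples:
    polarity +1 for brightening, -1 for darkening.
    """
    # Event sensors respond to relative (log-scale) change, not absolute
    # intensity, which is a key source of their high dynamic range.
    eps = 1e-6
    diff = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarity = np.sign(diff[ys, xs]).astype(int)
    return np.stack([ys, xs, polarity], axis=1)

# One pixel brightens, one darkens, two stay constant: only the
# changed pixels produce events, so static scenes cost nothing.
prev = np.array([[0.2, 0.8], [0.5, 0.5]])
curr = np.array([[0.8, 0.2], [0.5, 0.5]])
events = generate_events(prev, curr)
```

In a real sensor these comparisons happen continuously and asynchronously per pixel, each event carrying a microsecond-scale timestamp; the frame-difference form here only approximates that behavior for illustration.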