Event-Based Stereo
Event-based stereo aims to reconstruct 3D scenes from the asynchronous, high-temporal-resolution output of pairs of event cameras, offering advantages over traditional frame-based stereo vision. Current research focuses on improving disparity-estimation accuracy through algorithms that exploit the temporal structure of event streams, often incorporating motion cues or fusing data from other sensors such as LiDAR or inertial measurement units to cope with the sparsity of event data. These advances matter for robotics and autonomous systems, where they enable robust depth perception in dynamic and low-light conditions, and they are driving the creation of new benchmark datasets for evaluating such methods.
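To make the core idea concrete, the sketch below shows one simple (hypothetical, illustrative) way temporal information can drive stereo matching: for a rectified pair of event cameras, events on the same image row with the same polarity and nearly identical timestamps are assumed to come from the same scene point, and their horizontal offset gives the disparity. The function name, thresholds, and event layout `(x, y, t, p)` are assumptions for this example, not any specific published method.

```python
import numpy as np

def match_events(left, right, dt_max=1e-3, d_max=40):
    """Match rectified stereo events by temporal proximity.

    left, right: arrays with one event per row, columns (x, y, t, p),
    where t is the timestamp and p the polarity.
    Returns disparities d = x_left - x_right for matched pairs.
    """
    disparities = []
    for xl, yl, tl, pl in left:
        # Candidates: same row (rectified epipolar line), same polarity,
        # and close in time -- the key event-camera cue.
        mask = (
            (right[:, 1] == yl)
            & (right[:, 3] == pl)
            & (np.abs(right[:, 2] - tl) < dt_max)
        )
        cands = right[mask]
        if len(cands) == 0:
            continue
        # Pick the temporally closest candidate within the disparity range.
        best = cands[np.argmin(np.abs(cands[:, 2] - tl))]
        d = xl - best[0]
        if 0 <= d <= d_max:
            disparities.append(d)
    return np.array(disparities)

# Tiny synthetic example: one scene point seen at disparity 5.
left = np.array([[20.0, 10.0, 0.0100, 1.0]])
right = np.array([[15.0, 10.0, 0.0101, 1.0],   # near-simultaneous match
                  [30.0, 10.0, 0.0500, 1.0]])  # too far apart in time
print(match_events(left, right))  # → [5.]
```

Real systems refine this with motion-compensated event surfaces, sub-pixel interpolation, and regularization, but the timestamp-coincidence constraint illustrated here is what distinguishes event-based matching from frame-based block matching.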