Event Camera
Event cameras are bio-inspired sensors that asynchronously detect changes in light intensity, offering advantages over traditional cameras in high-speed, low-light, and high-dynamic-range scenarios. Current research focuses on developing algorithms and models, including neural networks (e.g., transformers, convolutional neural networks, and spiking neural networks), for tasks such as 3D reconstruction, object tracking, and depth estimation from event data, often fusing event streams with frame-based data for improved performance. This technology holds significant promise for robotics, autonomous driving, and other applications requiring robust and efficient visual perception in challenging environments. New datasets and improved event data augmentation techniques are also key areas of ongoing research.
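As a concrete illustration of how raw event streams are commonly prepared for the neural networks mentioned above, the sketch below accumulates events (x, y, timestamp, polarity) into a fixed-size voxel grid that a CNN or transformer can consume like a multi-channel image. This is a minimal, generic sketch; the function name, bin count, and sensor resolution are illustrative assumptions and are not drawn from the papers listed below.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events with columns [x, y, t, p] (p in {-1, +1})
    into a (num_bins, height, width) tensor of signed event counts."""
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return voxel

    x = events[:, 0].astype(np.int64)
    y = events[:, 1].astype(np.int64)
    t = events[:, 2].astype(np.float64)
    p = events[:, 3].astype(np.float32)

    # Normalize timestamps to [0, num_bins - 1] so each event falls into a temporal bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1)
    bin_idx = t_norm.astype(np.int64)

    # Accumulate signed polarities per (bin, y, x) cell.
    np.add.at(voxel, (bin_idx, y, x), p)
    return voxel

# Example: 1000 synthetic events on a 240x180 sensor, split into 5 temporal bins.
rng = np.random.default_rng(0)
events = np.column_stack([
    rng.integers(0, 240, 1000),        # x coordinate
    rng.integers(0, 180, 1000),        # y coordinate
    np.sort(rng.random(1000)),         # timestamp (monotonically increasing)
    rng.choice([-1.0, 1.0], 1000),     # polarity
])
grid = events_to_voxel_grid(events, num_bins=5, height=180, width=240)
print(grid.shape)  # (5, 180, 240)
```

Voxel grids are only one of several common event representations (others include time surfaces and raw point sets for spiking networks), but they make fusion with frame-based data straightforward because the result has an image-like layout.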
Papers
Tracking Any Point with Frame-Event Fusion Network at High Frame Rate
Jiaxiong Liu, Bo Wang, Zhen Tan, Jinpu Zhang, Hui Shen, Dewen Hu
EventAug: Multifaceted Spatio-Temporal Data Augmentation Methods for Event-based Learning
Yukun Tian, Hao Chen, Yongjian Deng, Feihong Shen, Kepan Liu, Wei You, Ziyang Zhang