Event Camera
Event cameras are bio-inspired sensors that asynchronously report per-pixel changes in light intensity, offering advantages over traditional frame-based cameras in high-speed, low-light, and high-dynamic-range scenarios. Current research focuses on developing algorithms and models, including neural networks (e.g., transformers, convolutional neural networks, and spiking neural networks), for tasks such as 3D reconstruction, object tracking, and depth estimation from event data, often fusing event streams with frame-based data for improved performance. This technology holds significant promise for robotics, autonomous driving, and other applications requiring robust and efficient visual perception in challenging environments. New datasets and improved event-data augmentation techniques are also key areas of ongoing research.
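To make the sensing model concrete: each pixel independently emits an event (x, y, t, polarity) whenever its log-intensity changes by more than a contrast threshold, and downstream algorithms often aggregate these sparse events into a frame-like representation. The sketch below is a simplified illustration of this idea, not any specific camera's or paper's pipeline; the threshold value and function names are illustrative assumptions.

```python
import numpy as np

def generate_events(prev_log, curr_log, t, threshold=0.2):
    """Emit (x, y, t, polarity) events at pixels whose log-intensity
    change exceeds the contrast threshold (simplified sensor model)."""
    diff = curr_log - prev_log
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarities = np.sign(diff[ys, xs]).astype(int)
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarities)]

def accumulate_events(events, shape):
    """Aggregate a list of events into a polarity-signed 2D frame,
    a common intermediate representation for learning-based methods."""
    frame = np.zeros(shape, dtype=int)
    for x, y, _, p in events:
        frame[y, x] += p
    return frame

# Example: two pixels change between consecutive log-intensity maps.
prev = np.zeros((2, 2))
curr = np.array([[0.3, 0.0],
                 [0.0, -0.3]])
events = generate_events(prev, curr, t=0.001)
frame = accumulate_events(events, shape=(2, 2))
```

Here `events` contains one positive event at (0, 0) and one negative event at (1, 1), and `frame` records +1 and -1 at those locations; all other pixels stay silent, which is what gives event cameras their sparsity and low latency.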
Papers
CodedEvents: Optimal Point-Spread-Function Engineering for 3D-Tracking with Event Cameras
Sachin Shah, Matthew Albert Chan, Haoming Cai, Jingxi Chen, Sakshum Kulshrestha, Chahat Deep Singh, Yiannis Aloimonos, Christopher Metzler
A Label-Free and Non-Monotonic Metric for Evaluating Denoising in Event Cameras
Chenyang Shi, Shasha Guo, Boyi Wei, Hanxiao Liu, Yibo Zhang, Ningfang Song, Jing Jin
OpenESS: Event-based Semantic Scene Understanding with Open Vocabularies
Lingdong Kong, Youquan Liu, Lai Xing Ng, Benoit R. Cottereau, Wei Tsang Ooi
TENet: Targetness Entanglement Incorporating with Multi-Scale Pooling and Mutually-Guided Fusion for RGB-E Object Tracking
Pengcheng Shao, Tianyang Xu, Zhangyong Tang, Linze Li, Xiao-Jun Wu, Josef Kittler