Event-Based Cameras
Event-based cameras, inspired by the biological retina, asynchronously record per-pixel brightness changes rather than full frames, offering high temporal resolution and low power consumption compared with traditional frame-based cameras. Current research focuses on algorithms and architectures, including spiking neural networks and transformers, for tasks such as object detection, semantic segmentation, and visual odometry, often fusing event data with frame-based information to improve performance. These capabilities are driving advances in robotics, autonomous driving, and other applications that require real-time vision in dynamic or low-light conditions, particularly where energy efficiency is critical.
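To make the asynchronous output concrete, below is a minimal sketch of how an event stream might be handled in practice. The event layout (x, y, timestamp, polarity) and the helper `events_to_frame` are illustrative assumptions, not an API from any of the papers listed here; the sketch simply accumulates signed event counts over a time window into a 2D frame, a common preprocessing step before feeding events to frame-oriented networks.

```python
import numpy as np

# Hypothetical event stream: each event is (x, y, timestamp_us, polarity),
# where polarity is +1 for a brightness increase and -1 for a decrease.
events = np.array(
    [(12, 40, 1000, +1), (13, 40, 1250, +1), (12, 41, 1900, -1)],
    dtype=[("x", "u2"), ("y", "u2"), ("t", "u8"), ("p", "i1")],
)

def events_to_frame(events, width, height, t_start, t_end):
    """Accumulate signed event counts in [t_start, t_end) into a 2D frame."""
    frame = np.zeros((height, width), dtype=np.int32)
    window = events[(events["t"] >= t_start) & (events["t"] < t_end)]
    # Scatter-add each event's polarity at its pixel location.
    np.add.at(frame, (window["y"], window["x"]), window["p"])
    return frame

frame = events_to_frame(events, width=64, height=64, t_start=0, t_end=2000)
print(frame[40, 12])  # net polarity count at pixel (x=12, y=40)
```

Other representations (voxel grids, time surfaces, or direct spike trains for spiking neural networks) follow the same idea of binning or encoding the asynchronous events before downstream processing.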
Papers
Optical flow estimation from event-based cameras and spiking neural networks
Javier Cuadrado, Ulysse Rançon, Benoît Cottereau, Francisco Barranco, Timothée Masquelier
A Neuromorphic Dataset for Object Segmentation in Indoor Cluttered Environment
Xiaoqian Huang, Kachole Sanket, Abdulla Ayyad, Fariborz Baghaei Naeini, Dimitrios Makris, Yahya Zweiri