Paper ID: 2212.07754
Event-based Visual Tracking in Dynamic Environments
Irene Perez-Salesa, Rodrigo Aldana-Lopez, Carlos Sagues
Visual object tracking under challenging motion and lighting conditions can be hindered by the limitations of conventional cameras, which are prone to producing motion-blurred images. Event cameras are novel sensors well suited to performing vision tasks robustly under such conditions. However, due to the nature of their output, applying them to object detection and tracking is non-trivial. In this work, we propose a framework that takes advantage of both event cameras and off-the-shelf deep learning for object tracking. We show that reconstructing event data into intensity frames improves tracking performance in conditions under which conventional cameras fail to provide acceptable results.
Submitted: Dec 15, 2022