Paper ID: 2406.16875

Multi-Stage Fusion Architecture for Small-Drone Localization and Identification Using Passive RF and EO Imagery: A Case Study

Thakshila Wimalajeewa Wewelwala, Thomas W. Tedesso, Tony Davis

Reliable detection, localization, and identification of small drones are essential to promoting the safe, secure, and privacy-respecting operation of Unmanned Aerial Systems (UAS), or simply, drones. This is an increasingly challenging problem with single-modality sensing alone, especially when detecting and identifying small drones. In this work, a multi-stage fusion architecture using passive radio frequency (RF) and electro-optic (EO) imagery data is developed to leverage the synergies of the two modalities and improve overall tracking and classification capabilities. For detection with EO imagery, both supervised deep learning-based techniques and unsupervised foreground/background separation techniques are explored to cope with challenging environments. Using real data collected for Group 1 and 2 drones, the capability of each algorithm is quantified. To compensate for performance gaps in detection with EO imagery alone, and to provide a unique device identifier for each drone, passive RF is integrated with EO imagery whenever available. In particular, drone detections in the image plane are combined with passive RF location estimates via detection-to-detection association after a 3D-to-2D transformation. Final tracking is performed on the composite detections in the 2D image plane, and each track centroid is assigned a unique identifier obtained via RF fingerprinting. The proposed fusion architecture is tested, and tracking and classification performance is quantified over range to illustrate the effectiveness of the proposed approaches, using simultaneously collected passive RF and EO data from the ESCAPE-21 (Experiments, Scenarios, Concept of Operations, and Prototype Engineering) data collection at the Air Force Research Laboratory (AFRL).
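
As a rough illustration of the fusion step described in the abstract (projecting 3D passive-RF location estimates into the 2D image plane and associating them with EO detections), the sketch below assumes a simple pinhole camera model and greedy nearest-neighbor gating. All names (e.g., `project_to_image`, `associate_detections`, `gate_px`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project_to_image(points_3d, K, R, t):
    """Project 3D world points (N x 3) into the 2D image plane
    using a pinhole camera model: x = K [R | t] X (assumed here)."""
    points_3d = np.asarray(points_3d, dtype=float)
    cam = R @ points_3d.T + t.reshape(3, 1)   # world frame -> camera frame
    uv = K @ cam                              # camera frame -> homogeneous pixels
    return (uv[:2] / uv[2]).T                 # N x 2 pixel coordinates

def associate_detections(eo_dets, rf_proj, gate_px=50.0):
    """Greedy nearest-neighbor association between EO image-plane detections
    and projected RF location estimates (both N x 2, in pixels).
    Returns (eo_index, rf_index) pairs whose distance is within the gate."""
    pairs, used_rf = [], set()
    for i, det in enumerate(eo_dets):
        dists = np.linalg.norm(rf_proj - det, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= gate_px and j not in used_rf:
            pairs.append((i, j))
            used_rf.add(j)
    return pairs

if __name__ == "__main__":
    # Toy example: identity rotation, camera at the origin (hypothetical values).
    K = np.array([[1000.0, 0.0, 640.0],
                  [0.0, 1000.0, 360.0],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.zeros(3)
    rf_estimates_3d = np.array([[10.0, 5.0, 100.0]])  # hypothetical RF localization output (meters)
    eo_detections = np.array([[742.0, 411.0]])        # hypothetical EO detection centroid (pixels)

    rf_proj = project_to_image(rf_estimates_3d, K, R, t)
    print(associate_detections(eo_detections, rf_proj))  # -> [(0, 0)]
```

In this sketch, associated pairs would form the composite detections on which image-plane tracking is performed; the RF fingerprint tied to each RF estimate could then label the resulting track centroid, in the spirit of the pipeline the abstract describes.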

Submitted: Mar 30, 2024