Paper ID: 2410.04242
A Framework for Reproducible Benchmarking and Performance Diagnosis of SLAM Systems
Nikola Radulov (1), Yuhao Zhang (1), Mihai Bujanca (2), Ruiqi Ye (1), Mikel Luján (1) ((1) Department of Computer Science, University of Manchester, UK, (2) Qualcomm Technologies XR Labs, Austria)
We propose SLAMFuse, an open-source SLAM benchmarking framework that provides consistent cross-platform environments for evaluating multi-modal SLAM algorithms, along with tools for data fuzzing, failure detection, and diagnosis across different datasets. Our framework introduces a fuzzing mechanism to test the resilience of SLAM algorithms against dataset perturbations. This enables the assessment of pose estimation accuracy under varying conditions and identifies critical perturbation thresholds. SLAMFuse improves diagnostics with failure detection and analysis tools that examine algorithm behaviour against dataset characteristics. SLAMFuse uses Docker to ensure reproducible testing conditions across diverse datasets and systems by streamlining dependency management. By emphasizing reproducibility and introducing advanced tools for algorithm evaluation and performance diagnosis, our work sets a new precedent for reliable benchmarking of SLAM systems. We provide ready-to-use Docker-compatible versions of the algorithms and datasets used in the experiments, together with guidelines for integrating and benchmarking new algorithms. Code is available at this https URL
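The perturbation-threshold idea from the abstract can be sketched as follows. This is a minimal illustration, not SLAMFuse's actual API: the function names, the Gaussian pixel-noise perturbation, and the `run_slam` callback (assumed to return an absolute trajectory error, ATE) are all assumptions for the sake of the example.

```python
import numpy as np

def perturb_frame(frame: np.ndarray, noise_std: float) -> np.ndarray:
    """Apply Gaussian pixel noise as one example of a dataset perturbation."""
    noisy = frame.astype(np.float64) + np.random.normal(0.0, noise_std, frame.shape)
    return np.clip(noisy, 0, 255).astype(frame.dtype)

def find_failure_threshold(run_slam, frames, noise_levels, ate_limit):
    """Sweep perturbation strengths (weakest first) and report the first
    level at which the SLAM run's trajectory error exceeds a limit.

    run_slam is a hypothetical callback: it takes the perturbed frame
    sequence and returns a scalar trajectory-error metric such as ATE.
    Returns None if the algorithm stays within the limit at every level.
    """
    for std in noise_levels:
        perturbed = [perturb_frame(f, std) for f in frames]
        ate = run_slam(perturbed)
        if ate > ate_limit:
            return std  # critical perturbation threshold
    return None
```

In a real benchmarking run, the sweep would be repeated per dataset and per algorithm, turning "how robust is this SLAM system?" into a concrete number: the smallest perturbation strength that pushes the error past an acceptable bound.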
Submitted: Oct 5, 2024