Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) is the fundamental robotics problem of building a map of an unknown environment while simultaneously tracking the robot's location within it. Current research emphasizes improving the robustness and efficiency of SLAM across diverse sensor modalities (LiDAR, cameras, IMUs) and challenging environments, often employing techniques such as graph-based optimization, learned scene representations (e.g., NeRFs and 3D Gaussian splatting), and deep learning for feature extraction and loop-closure detection. Advances in SLAM are crucial for enabling autonomous navigation in applications ranging from autonomous vehicles and industrial robots to medical procedures and the exploration of unstructured environments.
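The graph-based optimization mentioned above treats SLAM as least-squares estimation over a pose graph: nodes are robot poses, edges are relative-motion constraints from odometry and loop closures. A minimal sketch on a 1D world (the constraint values here are invented for illustration; real systems work over SE(2)/SE(3) and use solvers like g2o or Ceres):

```python
def solve(H, g):
    # Gaussian elimination with partial pivoting on the dense normal matrix.
    n = len(g)
    M = [row[:] + [g[k]] for k, row in enumerate(H)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def pose_graph_1d(constraints, n_poses):
    """Least-squares pose estimates on a line; pose x0 is anchored at 0.

    constraints: list of (i, j, d) meaning "x_j - x_i was measured as d".
    Builds the normal equations H x = g of the linear least-squares
    problem and solves them (in 1D, Gauss-Newton converges in one step).
    """
    m = n_poses - 1  # unknowns x1..x_{n-1}
    H = [[0.0] * m for _ in range(m)]
    g = [0.0] * m
    for i, j, d in constraints:
        # Residual r = x_j - x_i - d; Jacobian entries +1 at j, -1 at i.
        terms = [(j - 1, 1.0)] if j > 0 else []
        if i > 0:
            terms.append((i - 1, -1.0))
        for a, sa in terms:
            g[a] += sa * d
            for b, sb in terms:
                H[a][b] += sa * sb
    return [0.0] + solve(H, g)

# Two odometry edges of 1.0 m each, plus a loop closure saying the total
# displacement was only 1.8 m; the optimizer spreads the inconsistency.
poses = pose_graph_1d([(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.8)], n_poses=3)
```

The loop-closure edge pulls the drifted odometry chain back: each estimated step shrinks from 1.0 to about 0.933, distributing the 0.2 m inconsistency evenly across the three constraints.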
Papers
Customizable Perturbation Synthesis for Robust SLAM Benchmarking
Xiaohao Xu, Tianyi Zhang, Sibo Wang, Xiang Li, Yongqi Chen, Ye Li, Bhiksha Raj, Matthew Johnson-Roberson, Xiaonan Huang
UAV-assisted Visual SLAM Generating Reconstructed 3D Scene Graphs in GPS-denied Environments
Ahmed Radwan, Ali Tourani, Hriday Bavle, Holger Voos, Jose Luis Sanchez-Lopez
Particle Filter SLAM for Vehicle Localization
Tianrui Liu, Changxin Xu, Yuxin Qiao, Chufeng Jiang, Jiqiang Yu
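The last entry applies a particle filter to vehicle localization. As a sketch of the underlying technique, here is a minimal 1D Monte Carlo localization loop (the landmark position, noise levels, and trajectory are invented for the example, not taken from the paper):

```python
import math
import random

def gaussian(x, mu, sigma):
    """Density of N(mu, sigma^2) at x, used as the measurement likelihood."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def particle_filter_1d(moves, measurements, landmark,
                       n=1000, motion_noise=0.1, meas_noise=0.5, seed=0):
    """Estimate a robot's 1D position from noisy odometry and ranges.

    moves: commanded displacement per step; measurements: measured
    distance to a known landmark after each move. Returns the posterior
    mean position after the last step.
    """
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, 10.0) for _ in range(n)]  # uniform prior
    for u, z in zip(moves, measurements):
        # Predict: propagate each particle through the noisy motion model.
        particles = [p + u + rng.gauss(0.0, motion_noise) for p in particles]
        # Update: weight particles by how well they explain the range z.
        weights = [gaussian(abs(landmark - p), z, meas_noise) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Resample: draw a new particle set proportional to the weights.
        particles = rng.choices(particles, weights=weights, k=n)
    return sum(particles) / n

# Robot starts near x=2, moves +1 three times toward a landmark at x=10;
# the true ranges after each move are 7, 6, 5.
estimate = particle_filter_1d([1.0, 1.0, 1.0], [7.0, 6.0, 5.0], landmark=10.0)
```

Real particle-filter SLAM (e.g., FastRAO/FastSLAM-style systems) additionally estimates the landmark map per particle; this sketch keeps the map known to isolate the predict-weight-resample cycle.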