Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) is a fundamental robotics problem: building a map of an unknown environment while simultaneously tracking the robot's pose within that map. Current research emphasizes improving SLAM's robustness and efficiency across diverse sensor modalities (LiDAR, cameras, IMUs) and challenging environments, often employing techniques such as graph-based optimization, learned scene representations (e.g., NeRFs and 3D Gaussian splatting), and deep learning for feature extraction and loop closure detection. Advances in SLAM are crucial for enabling autonomous navigation in applications ranging from autonomous vehicles and industrial robots to medical procedures and the exploration of unstructured environments.
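To make the graph-based optimization and loop-closure ideas mentioned above concrete, below is a minimal, self-contained sketch of 2D pose-graph SLAM. It is a hypothetical toy example (not the method of any paper listed here): poses are (x, y, theta), edges are relative-pose measurements between pose pairs, and a nonlinear least-squares solver (SciPy) refines the poses so that predicted relative poses match the measurements, including one loop-closure edge.

```python
# Toy 2D pose-graph SLAM sketch (assumed example, not tied to any paper above).
# Poses: (x, y, theta). Edges: (i, j, measured relative pose of j in frame i).
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def relative_pose(xi, xj):
    """Express pose xj in the coordinate frame of pose xi."""
    dx, dy = xj[0] - xi[0], xj[1] - xi[1]
    c, s = np.cos(xi[2]), np.sin(xi[2])
    return np.array([c * dx + s * dy, -s * dx + c * dy, wrap(xj[2] - xi[2])])

def residuals(flat_poses, edges):
    """Stack the prior on pose 0 and the error of every graph edge."""
    poses = flat_poses.reshape(-1, 3)
    res = [poses[0]]  # prior: anchor the first pose at the origin
    for i, j, z in edges:
        err = relative_pose(poses[i], poses[j]) - z
        err[2] = wrap(err[2])
        res.append(err)
    return np.concatenate(res)

# Odometry edges around a unit square, plus a loop-closure edge (3 -> 0).
edges = [
    (0, 1, np.array([1.0, 0.0, np.pi / 2])),
    (1, 2, np.array([1.0, 0.0, np.pi / 2])),
    (2, 3, np.array([1.0, 0.0, np.pi / 2])),
    (3, 0, np.array([1.0, 0.0, np.pi / 2])),  # loop closure
]

# Noisy initial guess for the four poses, flattened for the solver.
initial = np.array([
    [0.0, 0.0, 0.0],
    [1.1, 0.1, 1.5],
    [1.2, 1.1, 3.0],
    [0.1, 1.2, -1.6],
]).ravel()

sol = least_squares(residuals, initial, args=(edges,))
print(sol.x.reshape(-1, 3))  # optimized poses, close to the true square
```

Real SLAM back ends (e.g., g2o, GTSAM, Ceres) follow the same pattern at scale, adding information matrices per edge, robust kernels for outlier loop closures, and sparse solvers that exploit the graph structure.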
Papers
DRACo-SLAM: Distributed Robust Acoustic Communication-efficient SLAM for Imaging Sonar Equipped Underwater Robot Teams
John McConnell, Yewei Huang, Paul Szenher, Ivana Collado-Gonzalez, Brendan Englot
A Benchmark for Multi-Modal Lidar SLAM with Ground Truth in GNSS-Denied Environments
Ha Sier, Li Qingqing, Yu Xianjia, Jorge Peña Queralta, Zhuo Zou, Tomi Westerlund
MeSLAM: Memory Efficient SLAM based on Neural Fields
Evgenii Kruzhkov, Alena Savinykh, Pavel Karpyshev, Mikhail Kurenkov, Evgeny Yudin, Andrei Potapov, Dzmitry Tsetserukou
LMBAO: A Landmark Map for Bundle Adjustment Odometry in LiDAR SLAM
Letian Zhang, Jinping Wang, Lu Jie, Nanjie Chen, Xiaojun Tan, Zhifei Duan