Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) is a fundamental robotics problem aiming to build a map of an unknown environment while simultaneously tracking the robot's location within that map. Current research emphasizes improving SLAM's robustness and efficiency across diverse sensor modalities (LiDAR, cameras, IMUs) and challenging environments, often employing techniques like graph-based optimization, neural implicit representations (e.g., Gaussian splatting, NeRFs), and deep learning for feature extraction and loop closure detection. Advances in SLAM are crucial for enabling autonomous navigation in various applications, from autonomous vehicles and robots in industrial settings to medical procedures and exploration of unstructured environments.
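The graph-based optimization mentioned above can be illustrated with a deliberately tiny sketch: a 1D pose graph where odometry constraints accumulate drift and a single loop closure corrects it. This is an illustrative toy, not the method of any paper listed below; the measurement values and the `optimize_pose_graph` helper are invented for the example, and real systems use nonlinear solvers over SE(2)/SE(3) poses rather than scalars.

```python
import numpy as np

# Toy 1D pose-graph optimization (illustrative only).
# Poses are scalar positions along a corridor; each constraint is a
# noisy relative measurement x[j] - x[i] ~= z, including one loop closure.

def optimize_pose_graph(n_poses, constraints):
    """Solve the linear least-squares pose graph.

    constraints: list of (i, j, z) meaning x[j] - x[i] ~= z.
    Pose 0 is anchored at the origin to remove the gauge freedom
    (otherwise shifting all poses equally leaves every residual unchanged).
    """
    rows, rhs = [], []
    # Anchor constraint: x[0] = 0
    anchor = np.zeros(n_poses)
    anchor[0] = 1.0
    rows.append(anchor)
    rhs.append(0.0)
    for i, j, z in constraints:
        row = np.zeros(n_poses)
        row[i] = -1.0
        row[j] = 1.0
        rows.append(row)
        rhs.append(z)
    A = np.array(rows)
    b = np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Odometry claims each step covers 1.0 m, but a loop closure observes
# that pose 3 is only 2.7 m from pose 0 (made-up numbers for the demo).
constraints = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (0, 3, 2.7)]
estimate = optimize_pose_graph(4, constraints)
```

After optimization the error is spread evenly across the chain: the final pose lands at 2.775 m, between the raw odometry total (3.0 m) and the loop-closure measurement (2.7 m), which is the essential behavior graph-based SLAM back-ends provide at scale.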
Papers
SubT-MRS Dataset: Pushing SLAM Towards All-weather Environments
Shibo Zhao, Yuanjun Gao, Tianhao Wu, Damanpreet Singh, Rushan Jiang, Haoxiang Sun, Mansi Sarawata, Yuheng Qiu, Warren Whittaker, Ian Higgins, Yi Du, Shaoshu Su, Can Xu, John Keller, Jay Karhade, Lucas Nogueira, Sourojit Saha, Ji Zhang, Wenshan Wang, Chen Wang, Sebastian Scherer
Reinforcement Learning with Frontier-Based Exploration via Autonomous Environment
Kenji Leong
Multi S-graphs: A Collaborative Semantic SLAM architecture
Miguel Fernandez-Cortizas, Hriday Bavle, Jose Luis Sanchez-Lopez, Pascual Campoy, Holger Voos
Set-Type Belief Propagation with Applications to Poisson Multi-Bernoulli SLAM
Hyowon Kim, Angel F. García-Fernández, Yu Ge, Yuxuan Xia, Lennart Svensson, Henk Wymeersch