Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) is a fundamental robotics problem: building a map of an unknown environment while simultaneously tracking the robot's location within that map. Current research emphasizes improving SLAM's robustness and efficiency across diverse sensor modalities (LiDAR, cameras, IMUs) and challenging environments, often employing techniques such as graph-based optimization, neural implicit representations (e.g., Gaussian splatting, NeRFs), and deep learning for feature extraction and loop closure detection. Advances in SLAM are crucial for enabling autonomous navigation in a wide range of applications, from autonomous vehicles and industrial robots to medical procedures and the exploration of unstructured environments.
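To make the graph-based optimization idea concrete, the following sketch (a hypothetical toy example, not taken from any of the papers below) runs Gauss-Newton on a 1-D pose graph: nodes are robot positions, odometry edges constrain consecutive poses, and one loop-closure edge ties the last pose back to a relative measurement of the first. The function name `optimize_pose_graph` and all numbers are illustrative assumptions; real SLAM back ends (e.g., g2o, GTSAM, Ceres) solve the same kind of nonlinear least-squares problem over SE(2)/SE(3) poses.

```python
# Toy 1-D pose-graph optimization (illustrative only).
# Each edge (i, j, z, w) says "pose j is z ahead of pose i" with weight w.
import numpy as np

def optimize_pose_graph(num_poses, edges, num_iters=10):
    """Gauss-Newton over 1-D poses; returns optimized positions."""
    x = np.zeros(num_poses)
    for _ in range(num_iters):
        H = np.zeros((num_poses, num_poses))  # approx. Hessian J^T W J
        b = np.zeros(num_poses)               # gradient term -J^T W e
        for i, j, z, w in edges:
            e = (x[j] - x[i]) - z             # residual of the constraint
            # Jacobian of e w.r.t. (x_i, x_j) is (-1, +1)
            H[i, i] += w; H[j, j] += w
            H[i, j] -= w; H[j, i] -= w
            b[i] += w * e                     # = -w * e * (-1)
            b[j] -= w * e                     # = -w * e * (+1)
        H[0, 0] += 1e6                        # gauge freedom: anchor pose 0
        x += np.linalg.solve(H, b)
    return x

# Three odometry steps of length 1.0, plus a loop-closure measurement
# saying pose 3 is only 2.7 ahead of pose 0 (odometry claims 3.0).
edges = [(0, 1, 1.0, 1.0), (1, 2, 1.0, 1.0), (2, 3, 1.0, 1.0),
         (3, 0, -2.7, 2.0)]
x = optimize_pose_graph(4, edges)
# The optimizer spreads the loop-closure correction across all poses,
# pulling pose 3 from the raw odometry estimate (3.0) toward 2.7.
```

Because this toy problem is linear, a single Gauss-Newton step already reaches the least-squares optimum; in real SLAM the residuals involve rotations, so the linearize-solve loop must be repeated until convergence.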
Papers
MAC: Graph Sparsification by Maximizing Algebraic Connectivity
Kevin Doherty, Alan Papalia, Yewei Huang, David Rosen, Brendan Englot, John Leonard
GlORIE-SLAM: Globally Optimized RGB-only Implicit Encoding Point Cloud SLAM
Ganlin Zhang, Erik Sandström, Youmin Zhang, Manthan Patel, Luc Van Gool, Martin R. Oswald
Opti-Acoustic Semantic SLAM with Unknown Objects in Underwater Environments
Kurran Singh, Jungseok Hong, Nicholas R. Rypkema, John J. Leonard
High-Fidelity SLAM Using Gaussian Splatting with Rendering-Guided Densification and Regularized Optimization
Shuo Sun, Malcolm Mielle, Achim J. Lilienthal, Martin Magnusson
DVN-SLAM: Dynamic Visual Neural SLAM Based on Local-Global Encoding
Wenhua Wu, Guangming Wang, Ting Deng, Sebastian Aegidius, Stuart Shanks, Valerio Modugno, Dimitrios Kanoulas, Hesheng Wang
MCD: Diverse Large-Scale Multi-Campus Dataset for Robot Perception
Thien-Minh Nguyen, Shenghai Yuan, Thien Hoang Nguyen, Pengyu Yin, Haozhi Cao, Lihua Xie, Maciej Wozniak, Patric Jensfelt, Marko Thiel, Justin Ziegenbein, Noel Blunder