Simultaneous Localization and Mapping (SLAM)
Simultaneous Localization and Mapping (SLAM) aims to build a map of an unknown environment while simultaneously determining the location of a robot or sensor within that map. Current research emphasizes robust and efficient algorithms, including those based on iterative closest point (ICP), extended Kalman filters, bundle adjustment, and learned scene representations such as NeRFs and 3D Gaussian splatting, often incorporating sensor fusion from LiDAR, IMUs, cameras, and even WiFi signals. These advancements are crucial for enabling autonomous navigation in diverse environments, from indoor spaces and construction sites to outdoor terrains and even planetary exploration, with applications ranging from robotics and augmented reality to autonomous driving and infrastructure inspection.
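To make one of the named techniques concrete, below is a minimal sketch of point-to-point ICP, the scan-alignment primitive mentioned above: match each source point to its nearest destination point, then solve for the best rigid transform via SVD (the Kabsch solution), and repeat. Function names and the brute-force matching are illustrative assumptions for this toy example, not taken from any of the listed papers; real systems use k-d trees, outlier rejection, and point-to-plane variants.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Rigid transform (R, t) minimizing ||R @ src_i + t - dst_i||^2
    for matched point pairs, via SVD of the cross-covariance (Kabsch)."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(src, dst, iters=20):
    """Toy ICP loop: brute-force nearest-neighbour matching, then Kabsch.
    Fine for tiny demo clouds; real pipelines use spatial indices."""
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]   # closest dst point per cur point
        R, t = best_fit_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

Because the inner alignment step is closed-form, each iteration is cheap; the expensive part in practice is the nearest-neighbour search, which is why LiDAR pipelines pair ICP with k-d trees or voxel hashing.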
Papers
Visual SLAM: What are the Current Trends and What to Expect?
Ali Tourani, Hriday Bavle, Jose Luis Sanchez-Lopez, Holger Voos
Comparison of Varied 2D Mapping Approaches by Using Practice-Oriented Evaluation Criteria
Justin Ziegenbein, Manuel Schrick, Marko Thiel, Johannes Hinckeldeyn, Jochen Kreutzfeldt