Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) aims to build a map of an unknown environment while simultaneously estimating the location of a robot or sensor within that map. Current research emphasizes robust and efficient algorithms, including those based on iterative closest point (ICP), extended Kalman filters, bundle adjustment, and learned scene representations such as neural radiance fields (NeRFs) and 3D Gaussian splatting, often incorporating sensor fusion from LiDAR, IMUs, cameras, and even WiFi signals. These advancements are crucial for enabling autonomous navigation in diverse environments, from indoor spaces and construction sites to outdoor terrains and even planetary exploration, with applications ranging from robotics and augmented reality to autonomous driving and infrastructure inspection.
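To make one of the named building blocks concrete, the following is a minimal sketch of point-to-point ICP scan alignment using only NumPy. It is an illustration under simplifying assumptions (2D points, brute-force nearest-neighbor matching, no outlier rejection), not the method of any of the papers listed below.

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping points A onto B (Kabsch/SVD)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)            # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(src, dst, iters=30):
    """Align src to dst by alternating nearest-neighbor matching and rigid fitting."""
    cur = src.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # Brute-force nearest neighbor in dst for every point in cur.
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        # Best rigid transform for the current correspondences, then apply it.
        R, t = best_fit_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Hypothetical usage: recover a small known rotation and translation.
rng = np.random.default_rng(0)
src = rng.uniform(-1, 1, (30, 2))
theta = np.deg2rad(3.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([0.02, -0.01])
R_est, t_est = icp(src, dst)
```

Real SLAM front-ends replace the brute-force matching with k-d trees, add robust loss functions, and work in 3D, but the alternate-match-then-fit structure is the same.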
Papers
OASIS: Optimal Arrangements for Sensing in SLAM
Pushyami Kaveti, Matthew Giamou, Hanumant Singh, David M. Rosen
Vision-based Situational Graphs Exploiting Fiducial Markers for the Integration of Semantic Entities
Ali Tourani, Hriday Bavle, Jose Luis Sanchez-Lopez, Deniz Isinsu Avsar, Rafael Munoz Salinas, Holger Voos
A Robust and Rapidly Deployable Waypoint Navigation Architecture for Long-Duration Operations in GPS-Denied Environments
Erik Pearson, Brendan Englot
Detecting and Classifying Bio-Inspired Artificial Landmarks Using In-Air 3D Sonar
Maarten de Backer, Wouter Jansen, Dennis Laurijssen, Ralph Simon, Walter Daems, Jan Steckel