Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) aims to build a map of an unknown environment while simultaneously estimating the pose of a robot or sensor within that map. Current research emphasizes robust and efficient algorithms, including those based on iterative closest point (ICP), extended Kalman filters, bundle adjustment, and learned scene representations such as NeRFs and 3D Gaussian splatting, often fusing measurements from LiDAR, IMUs, cameras, and even WiFi signals. These advances are crucial for autonomous navigation in diverse environments, from indoor spaces and construction sites to outdoor terrain and planetary exploration, with applications ranging from robotics and augmented reality to autonomous driving and infrastructure inspection.
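As a concrete illustration of one classical building block named above, the sketch below implements point-to-point ICP: alternately match each point to its nearest neighbour in the target cloud, then solve for the best rigid transform with the SVD-based Kabsch method. This is a minimal NumPy sketch with illustrative names (`best_fit_transform`, `icp`) chosen here, not drawn from any of the papers listed; production pipelines typically add k-d tree correspondence search and outlier rejection.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(src, dst, max_iters=50, tol=1e-6):
    """Point-to-point ICP: align `src` to `dst`, returning the accumulated (R, t)."""
    dim = src.shape[1]
    R_total, t_total = np.eye(dim), np.zeros(dim)
    prev_err, cur = np.inf, src.copy()
    for _ in range(max_iters):
        # Brute-force nearest-neighbour correspondences (a k-d tree scales better).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = d2.argmin(axis=1)
        # Mean correspondence distance before this iteration's update.
        err = np.sqrt(d2[np.arange(len(cur)), nn]).mean()
        # Solve for the rigid update aligning current points to their matches.
        R, t = best_fit_transform(cur, dst[nn])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        if abs(prev_err - err) < tol:            # stop once the error plateaus
            break
        prev_err = err
    return R_total, t_total

if __name__ == "__main__":
    # Toy 2D example: recover a known rotation + translation.
    rng = np.random.default_rng(0)
    pts = rng.uniform(-1, 1, size=(100, 2))
    theta = 0.2
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    moved = pts @ R_true.T + np.array([0.3, -0.1])
    R_est, t_est = icp(pts, moved)
    print("max rotation error:", np.abs(R_est - R_true).max())
```

The determinant check in `best_fit_transform` matters in practice: without it, a degenerate or noisy correspondence set can yield a reflection rather than a proper rotation.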
Papers
Semantic Landmark Detection & Classification Using Neural Networks For 3D In-Air Sonar
Wouter Jansen, Jan Steckel
SLAM-based Joint Calibration of Multiple Asynchronous Microphone Arrays and Sound Source Localization
Jiang Wang, Yuanzheng He, Daobilige Su, Katsutoshi Itoyama, Kazuhiro Nakadai, Junfeng Wu, Shoudong Huang, Youfu Li, He Kong