SLAM Framework
Simultaneous Localization and Mapping (SLAM) frameworks build a map of an environment while simultaneously tracking the robot's pose within that map, using sensor data such as LiDAR, cameras, and IMUs. Current research emphasizes robustness and efficiency across diverse environments and hardware constraints, exploring techniques such as multi-modal sensor fusion (e.g., combining LiDAR and visual data), hierarchical approaches to pose estimation, and the incorporation of object-level features beyond basic geometric points. These advances are crucial for reliable autonomous navigation in applications ranging from commercial service robots to autonomous vehicles.
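To make the core idea concrete, below is a minimal, illustrative sketch of the back-end optimization step shared by many SLAM frameworks: pose-graph optimization. It is deliberately simplified (2D translations only, no orientation, no noise weighting, dense least squares instead of a sparse solver), and the specific poses, measurements, and the helper `idx` are invented for this example rather than taken from any particular framework.

```python
import numpy as np

# Minimal pose-graph SLAM sketch (translations only): poses are estimated by
# jointly satisfying odometry constraints and a loop-closure constraint in a
# least-squares sense. Real systems also estimate orientation, weight
# measurements by uncertainty, and use sparse/iterative solvers.

# Relative measurements (i, j, dx, dy): "pose j lies roughly at pose i + (dx, dy)".
odometry = [
    (0, 1, 1.0, 0.0),
    (1, 2, 1.0, 0.0),
    (2, 3, 0.0, 1.0),
    (3, 4, -1.0, 0.0),
]
# A loop closure: the robot recognizes it has returned near pose 0.
loop_closures = [(4, 0, -1.0, -1.0)]

num_poses = 5
constraints = odometry + loop_closures

# Build the linear system A x = b with pose 0 anchored at the origin
# (gauge constraint). The state vector stacks (x_k, y_k) for poses 1..N-1.
dim = 2 * (num_poses - 1)
A = np.zeros((2 * len(constraints), dim))
b = np.zeros(2 * len(constraints))

def idx(k):
    # Pose 0 is fixed, so it contributes no state entries.
    return 2 * (k - 1)

row = 0
for i, j, dx, dy in constraints:
    for axis, d in enumerate((dx, dy)):
        if j > 0:
            A[row, idx(j) + axis] += 1.0
        if i > 0:
            A[row, idx(i) + axis] -= 1.0
        b[row] = d
        row += 1

# Solving the over-determined system lets the loop closure redistribute
# accumulated odometry drift across the trajectory.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
poses = np.vstack([[0.0, 0.0], x.reshape(-1, 2)])
print(poses)
```

With noisy odometry, the loop-closure row is what pulls the drifted trajectory back into global consistency; full SLAM back-ends generalize this to SE(2)/SE(3) poses and landmark or object-level constraints.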