SLAM Framework

Simultaneous Localization and Mapping (SLAM) frameworks build a map of an environment while simultaneously tracking the robot's location within that map, using sensor data such as LiDAR, cameras, and IMUs. Current research emphasizes robustness and efficiency across diverse environments and hardware constraints, exploring techniques such as multi-modal sensor fusion (e.g., combining LiDAR and visual data), hierarchical approaches to pose estimation, and the incorporation of object-level features beyond basic geometric points. These advances are crucial for reliable autonomous navigation in applications ranging from commercial service robots to autonomous vehicles.
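
To make the core estimation problem concrete, the sketch below illustrates the graph-optimization formulation that many SLAM back-ends build on: poses become nodes, odometry and loop-closure measurements become relative constraints, and a least-squares solve produces the trajectory estimate. This is a deliberately minimal illustration, not any particular framework's API; it reduces planar poses to (x, y) positions, omits orientation, landmarks, and sensor fusion, and all names (e.g., `solve_pose_graph`) and numeric values are illustrative assumptions.

```python
# Minimal sketch of pose-graph optimization (SLAM back-end idea), assuming
# planar poses reduced to (x, y) and linear relative-position constraints.
# Real systems also estimate orientation and landmarks and fuse LiDAR,
# camera, and IMU data; names and values here are illustrative only.
import numpy as np

def solve_pose_graph(num_poses, edges, prior_index=0, prior_xy=(0.0, 0.0)):
    """Least-squares estimate of 2-D positions from relative-motion constraints.

    edges: list of (i, j, dx, dy) meaning pose j is measured to lie at
    (dx, dy) relative to pose i (e.g., from odometry or a loop closure).
    """
    n = 2 * num_poses
    rows, b = [], []

    # Prior: anchor one pose so the linear system has a unique solution.
    for k, v in enumerate(prior_xy):
        row = np.zeros(n)
        row[2 * prior_index + k] = 1.0
        rows.append(row)
        b.append(v)

    # One linear constraint per edge and coordinate: x_j - x_i = dx (same for y).
    for i, j, dx, dy in edges:
        for k, meas in enumerate((dx, dy)):
            row = np.zeros(n)
            row[2 * j + k] = 1.0
            row[2 * i + k] = -1.0
            rows.append(row)
            b.append(meas)

    A = np.vstack(rows)
    x, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
    return x.reshape(num_poses, 2)

if __name__ == "__main__":
    # Square trajectory: noisy odometry drifts, and a loop closure back to
    # pose 0 pulls the whole estimate back into a consistent shape.
    odometry = [
        (0, 1, 1.05, 0.02),   # intended motion: +1 m in x
        (1, 2, -0.03, 0.98),  # +1 m in y
        (2, 3, -1.02, 0.04),  # -1 m in x
    ]
    loop_closure = [(3, 0, 0.0, -1.0)]  # pose 0 observed 1 m below pose 3
    estimate = solve_pose_graph(4, odometry + loop_closure)
    print(np.round(estimate, 3))
```

Full systems replace the linear constraints above with nonlinear factors over SE(2)/SE(3) poses and solve them iteratively (e.g., Gauss-Newton), but the structure of the problem is the same: a graph of pose nodes tied together by sensor-derived constraints.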

Papers