Monocular SLAM
Monocular SLAM (Simultaneous Localization and Mapping) aims to reconstruct a 3D environment while simultaneously tracking the camera's pose using only a single camera, a problem made challenging by the inherent scale ambiguity of monocular vision. Current research focuses on improving accuracy and efficiency through novel approaches such as Gaussian splatting, deep learning-based feature extraction and tracking (e.g., deep keypoints or patch-based methods), and the incorporation of additional sensor data (e.g., inertial measurements or LiDAR). These advances are driving progress in applications such as augmented reality, robotics, and medical imaging, particularly in scenarios where deploying multiple cameras or other sensors is impractical or impossible.
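The scale ambiguity mentioned above can be seen directly from the pinhole projection model: scaling the whole scene and the camera translation by the same factor leaves every image measurement unchanged, so a single camera cannot recover absolute scale. The following is a minimal illustrative sketch of that fact; the intrinsics, pose, and 3D points are arbitrary values chosen for the example, not from any particular SLAM system.

```python
import numpy as np

def project(points_w, R, t, K):
    """Project 3D world points into the image with a pinhole camera."""
    points_c = (R @ points_w.T + t[:, None]).T   # world -> camera frame
    uv = (K @ points_c.T).T                      # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]                # perspective divide

# Arbitrary (assumed) camera intrinsics, pose, and scene points.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])
points = np.array([[ 0.5,  0.2, 4.0],
                   [-0.3,  0.1, 6.0],
                   [ 0.0, -0.4, 5.0]])

s = 2.5  # any global scale factor
uv_original = project(points, R, t, K)
uv_scaled = project(s * points, R, s * t, K)  # scene and baseline scaled together

# The projections are identical, so absolute scale is unobservable
# from monocular images alone.
assert np.allclose(uv_original, uv_scaled)
print("Projections match; scale cannot be recovered from one camera.")
```

This is why monocular pipelines either report trajectories up to an unknown scale or fuse additional cues, such as inertial measurements, LiDAR, or learned depth priors, to resolve it.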