Depth Fusion
Depth fusion integrates data from multiple depth sources to improve the accuracy, robustness, and completeness of depth maps. Current research focuses on efficient fusion algorithms and architectures, such as transformer-based networks and methods that leverage multi-scale or multi-view information, often incorporating attention mechanisms and uncertainty estimation to handle noise and inconsistencies across sources. These advances benefit fields like autonomous driving, augmented reality, and 3D reconstruction by enabling more reliable and detailed 3D scene understanding from diverse sensor modalities.
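As a concrete illustration of uncertainty-aware fusion, the sketch below combines per-pixel depth estimates from multiple sources by inverse-variance weighting, so that more confident sources dominate the fused result. This is a minimal, generic example, not the method of any particular paper; the function name `fuse_depth_maps` and the stereo/LiDAR toy data are hypothetical.

```python
import numpy as np

def fuse_depth_maps(depths, variances, eps=1e-6):
    """Fuse per-pixel depth estimates from multiple sources by
    inverse-variance weighting; pixels with lower uncertainty
    contribute more to the fused depth.

    depths:    list of HxW arrays (NaN where a source has no estimate)
    variances: list of HxW arrays of per-pixel uncertainty (variance)
    """
    depths = np.stack(depths)        # (S, H, W)
    variances = np.stack(variances)  # (S, H, W)

    weights = 1.0 / (variances + eps)
    weights = np.where(np.isnan(depths), 0.0, weights)  # ignore missing pixels
    depths = np.nan_to_num(depths, nan=0.0)

    weight_sum = weights.sum(axis=0)
    fused = (weights * depths).sum(axis=0) / np.maximum(weight_sum, eps)
    fused_var = 1.0 / np.maximum(weight_sum, eps)  # fused per-pixel uncertainty
    fused[weight_sum < eps] = np.nan               # no source observed this pixel
    return fused, fused_var

# Example: fuse a dense-but-noisy stereo depth map with a sparse-but-accurate
# LiDAR depth map (both synthetic here).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = np.full((4, 4), 5.0)                   # ground-truth depth in metres
    stereo = gt + rng.normal(0, 0.5, gt.shape)  # dense, noisy (sigma = 0.5 m)
    lidar = gt + rng.normal(0, 0.05, gt.shape)  # accurate (sigma = 0.05 m)
    lidar[::2, :] = np.nan                      # simulate missing LiDAR returns
    fused, var = fuse_depth_maps(
        [stereo, lidar],
        [np.full(gt.shape, 0.25), np.full(gt.shape, 0.0025)],
    )
    print(np.round(fused, 2))
```

Learned fusion networks replace the fixed inverse-variance weights with predicted per-pixel confidences (e.g., from an attention module), but the underlying idea of weighting sources by estimated reliability is the same.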