LiDAR-Inertial Fusion
LiDAR-inertial fusion integrates data from Light Detection and Ranging (LiDAR) sensors and inertial measurement units (IMUs) to achieve accurate, robust simultaneous localization and mapping (SLAM) for robots and autonomous systems. Current research emphasizes efficient, resilient estimation algorithms, often built on factor graphs or transformer-based architectures, that handle sensor noise, geometric degeneracy (e.g., in feature-poor environments such as long corridors or tunnels), and failures of individual sensor modalities, sometimes by incorporating additional sensors such as cameras or radar. These advances are crucial for reliable autonomous navigation in challenging and unpredictable environments, with applications ranging from robotics and autonomous vehicles to 3D scene reconstruction and digital twin creation.
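To make the factor-graph idea concrete, here is a deliberately simplified 1D sketch (not any particular system's implementation): high-rate IMU odometry factors chain consecutive poses, while sparse LiDAR "registration" factors anchor a few poses absolutely. Solving the stacked linear least-squares problem, the linear analogue of what nonlinear solvers such as GTSAM or Ceres do iteratively, shows how the absolute factors suppress odometry drift. All numbers and helper names (`fuse`, `solve_normal_equations`) are illustrative assumptions, not library APIs.

```python
# Toy 1D factor-graph fusion sketch (illustrative only, pure stdlib Python).

def solve_normal_equations(A, b):
    """Solve (A^T A) x = A^T b by Gaussian elimination with partial pivoting."""
    m, n = len(A), len(A[0])
    # Form the normal equations M x = v.
    M = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    # Forward elimination.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    # Back-substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (v[r] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fuse(odom, lidar_abs, n_poses):
    """Stack prior, IMU-odometry, and LiDAR absolute factors as LS rows."""
    A, b = [], []
    row = [0.0] * n_poses
    row[0] = 1.0
    A.append(row); b.append(0.0)              # prior factor: x0 = 0
    for i, d in enumerate(odom):              # odometry: x_{i+1} - x_i = d
        row = [0.0] * n_poses
        row[i], row[i + 1] = -1.0, 1.0
        A.append(row); b.append(d)
    for i, z in enumerate(lidar_abs):         # sparse absolute fixes: x_i = z
        if z is not None:
            row = [0.0] * n_poses
            row[i] = 1.0
            A.append(row); b.append(z)
    return solve_normal_equations(A, b)

# IMU odometry over-reports each 1 m step as 1.1 m (10% drift);
# a single LiDAR factor anchors the last pose near the truth.
est = fuse(odom=[1.1, 1.1, 1.1], lidar_abs=[None, None, None, 3.0], n_poses=4)
print([round(x, 3) for x in est])  # drift is spread across the whole chain
```

Without the LiDAR factor the chain would end at 3.3 m; with it, the least-squares solve distributes the discrepancy over all factors, which is the essential behavior factor-graph SLAM back-ends exploit at scale (with SE(3) poses, preintegrated IMU factors, and robust noise models instead of this scalar toy).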