Local Navigation
Local navigation research focuses on enabling robots and autonomous systems to navigate complex environments safely and efficiently, reaching goals specified in various ways, including language instructions or visual cues. Current efforts concentrate on improving perception (e.g., multi-sensor fusion, 3D reconstruction, and vision-language models) and planning (e.g., reinforcement learning, model predictive control, and A* search) in dynamic, cluttered settings. These advances are crucial for applications ranging from autonomous vehicles and drones to assistive technologies for visually impaired users, with impact on both robotics and accessibility.
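To make the planning side concrete, below is a minimal sketch of A* search on a 2D occupancy grid, one of the planners named above. The grid layout, start and goal coordinates, 4-connected neighborhood, and unit step costs are illustrative assumptions, not taken from any of the listed papers.

```python
# Minimal A* grid-search sketch for local path planning, assuming a 2D
# occupancy grid where 0 = free and 1 = obstacle (illustrative setup).
import heapq


def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def heuristic(cell):
        # Manhattan distance: admissible for a 4-connected grid with unit costs.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(heuristic(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {start: None}
    best_g = {start: 0}

    while open_heap:
        _, g, current = heapq.heappop(open_heap)
        if current == goal:
            # Reconstruct the path by walking parent links back to the start.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        if g > best_g[current]:
            continue  # stale heap entry; a cheaper route was already expanded
        r, c = current
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            neighbor = (r + dr, c + dc)
            nr, nc = neighbor
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_g = g + 1
                if new_g < best_g.get(neighbor, float("inf")):
                    best_g[neighbor] = new_g
                    came_from[neighbor] = current
                    heapq.heappush(
                        open_heap, (new_g + heuristic(neighbor), new_g, neighbor)
                    )
    return None  # no collision-free path exists


if __name__ == "__main__":
    # Small cluttered map: plan from the top-left corner to the bottom-right.
    grid = [
        [0, 0, 0, 1, 0],
        [1, 1, 0, 1, 0],
        [0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0],
    ]
    print(astar(grid, (0, 0), (4, 4)))
```

In practice such a grid planner is only one layer of a local navigation stack: perception modules would populate the occupancy grid, and a lower-level controller (e.g., MPC) would track the resulting path.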
Papers
Navigation and Control of Unconventional VTOL UAVs in Forward-Flight with Explicit Wind Velocity Estimation
Mitchell Cohen, James Richard Forbes
Pushing the Limits of Cross-Embodiment Learning for Manipulation and Navigation
Jonathan Yang, Catherine Glossop, Arjun Bhorkar, Dhruv Shah, Quan Vuong, Chelsea Finn, Dorsa Sadigh, Sergey Levine
Simultaneous Calibration and Navigation (SCAN) of Multiple Ultrasonic Local Positioning Systems
David Gualda, Jesus Urena, Juan C. Garcia, Enrique Garcia, Jose Alcala
Prediction-for-CompAction: navigation in social environments using generalized cognitive maps
José Antonio Villacorta Atienza, Carlos Calvo Tapia, Valeriy A. Makarov Slizneva