Local Navigation
Local navigation research focuses on enabling robots and autonomous systems to navigate complex environments efficiently and safely, toward goals specified through various means such as language or visual cues. Current efforts concentrate on improving perception (e.g., multi-sensor fusion, 3D reconstruction, and vision-language models) and planning (e.g., reinforcement learning, model predictive control, and A* search) in dynamic and cluttered settings. These advancements are crucial for applications ranging from autonomous vehicles and drones to assistive technologies for visually impaired individuals, impacting both robotics and accessibility.
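Of the planning methods mentioned above, A* search is the most self-contained to illustrate. The following is a minimal sketch of A* on a 4-connected occupancy grid — a simplified stand-in for the cluttered-environment planners these papers study, not an implementation from any of them. The grid encoding (0 = free, 1 = obstacle) and the Manhattan-distance heuristic are illustrative choices.

```python
import heapq

def astar(grid, start, goal):
    """A* path search on a 4-connected occupancy grid.

    grid: 2D list where 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples. Returns a list of cells
    from start to goal, or None if no path exists.
    """
    def h(a, b):
        # Manhattan distance: admissible (never overestimates)
        # on a 4-connected grid with unit step cost.
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start, goal), 0, start)]  # (f-score, g-score, cell)
    came_from = {}
    g = {start: 0}

    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:
            # Reconstruct the path by walking parent pointers back.
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if cost > g.get(cur, float("inf")):
            continue  # stale heap entry; a cheaper route was found later
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt, goal), ng, nxt))
    return None  # goal unreachable

# Example: a wall forces a detour around row 1.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

In practice, local planners layer costs for obstacle proximity and dynamics on top of this skeleton, or replace it entirely with learned policies or MPC as in several of the papers below.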
Papers
NaVIP: An Image-Centric Indoor Navigation Solution for Visually Impaired People
Jun Yu, Yifan Zhang, Badrinadh Aila, Vinod Namboodiri
BoxMap: Efficient Structural Mapping and Navigation
Zili Wang, Christopher Allum, Sean B. Andersson, Roberto Tron
Neural Signal Operated Intelligent Robot: Human-guided Robot Maze Navigation through SSVEP
Jiarui Tang, Tingrui Sun, Siwen Wang
UAV-Assisted Self-Supervised Terrain Awareness for Off-Road Navigation
Jean-Michel Fortin, Olivier Gamache, William Fecteau, Effie Daum, William Larrivée-Hardy, François Pomerleau, Philippe Giguère
Navigation in a simplified Urban Flow through Deep Reinforcement Learning
Federica Tonti, Jean Rabault, Ricardo Vinuesa
Safety Verification and Navigation for Autonomous Vehicles based on Signal Temporal Logic Constraints
Aditya Parameshwaran, Yue Wang
Safety-critical Locomotion of Biped Robots in Infeasible Paths: Overcoming Obstacles during Navigation toward Destination
Jaemin Lee, Min Dai, Jeeseop Kim, Aaron D. Ames
Underwater robot guidance, navigation and control in fish net pens
Sveinung Johan Ohrem