Local Navigation
Local navigation research focuses on enabling robots and autonomous systems to efficiently and safely navigate complex environments, achieving goals specified through various means, including language or visual cues. Current efforts concentrate on improving perception (e.g., using multi-sensor fusion, 3D reconstruction, and vision-language models) and planning (e.g., employing reinforcement learning, model predictive control, and A* search algorithms) in dynamic and cluttered settings. These advancements are crucial for applications ranging from autonomous vehicles and drones to assistive technologies for visually impaired individuals, impacting both robotics and accessibility.
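To illustrate the planning side mentioned above, here is a minimal sketch of A* search on a 4-connected occupancy grid. This is a generic, hypothetical example (the grid, function name, and Manhattan heuristic are assumptions for illustration), not an implementation from any of the listed papers:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid.
    Cells equal to 1 are obstacles; Manhattan distance is the
    admissible heuristic, so the returned path is optimal."""
    rows, cols = len(grid), len(grid[0])

    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # (f = g + h, g, node)
    g = {start: 0}
    parent = {start: None}
    while open_heap:
        f, cost, node = heapq.heappop(open_heap)
        if node == goal:
            # Reconstruct the path by walking parent pointers back to start.
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        if cost > g.get(node, float("inf")):
            continue  # stale heap entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = cost + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    parent[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # goal unreachable

# Toy cluttered environment: 0 = free, 1 = obstacle.
grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
path = astar(grid, (0, 0), (3, 3))
```

In practice, the papers below layer much more on top of this core idea, e.g. replanning as dynamic obstacles move, or coupling the planner with learned perception.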
Papers
Safety Verification and Navigation for Autonomous Vehicles based on Signal Temporal Logic Constraints
Aditya Parameshwaran, Yue Wang
Safety-critical Locomotion of Biped Robots in Infeasible Paths: Overcoming Obstacles during Navigation toward Destination
Jaemin Lee, Min Dai, Jeeseop Kim, Aaron D. Ames
Underwater robot guidance, navigation and control in fish net pens
Sveinung Johan Ohrem
Neuromorphic Perception and Navigation for Mobile Robots: A Review
A. Novo, F. Lobon, H. G. De Marina, S. Romero, F. Barranco
LuSNAR: A Lunar Segmentation, Navigation and Reconstruction Dataset based on Muti-sensor for Autonomous Exploration
Jiayi Liu, Qianyu Zhang, Xue Wan, Shengyang Zhang, Yaolin Tian, Haodong Han, Yutao Zhao, Baichuan Liu, Zeyuan Zhao, Xubo Luo