Environment Perception
Environment perception in robotics and autonomous systems focuses on enabling machines to accurately understand their surroundings, a crucial step for safe and effective operation. Current research emphasizes robust perception under challenging conditions (e.g., varying lighting, adverse weather) through multimodal sensor fusion (combining LiDAR, radar, and cameras) and advanced deep learning architectures such as convolutional neural networks and transformers, often incorporating techniques like continual learning and uncertainty quantification. These advances are driving progress in diverse applications, including autonomous driving, drone racing, and assistive robotics, by improving safety, efficiency, and adaptability in complex environments.
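As a concrete illustration of the multimodal sensor fusion mentioned above, the following is a minimal sketch of the standard Bayesian occupancy-grid update, in which per-cell occupancy probabilities from two sensors are fused by adding their log-odds. The grids, sensor names, and probability values here are invented toy examples, not taken from the papers listed below; the sketch assumes the sensors' errors are independent.

```python
import math

def log_odds(p):
    """Convert an occupancy probability to log-odds."""
    return math.log(p / (1.0 - p))

def fuse_cell(p_a, p_b, prior=0.5):
    """Fuse two sensors' occupancy estimates for one grid cell by
    adding their log-odds (the standard Bayesian occupancy-grid
    update, assuming independent sensor errors), then convert the
    result back to a probability."""
    l = log_odds(p_a) + log_odds(p_b) - log_odds(prior)
    return 1.0 / (1.0 + math.exp(-l))

# Toy 3x3 grids: each sensor's estimated occupancy probability per cell
# (0.5 means the sensor is uninformative about that cell).
lidar = [[0.9, 0.5, 0.2],
         [0.5, 0.8, 0.5],
         [0.1, 0.5, 0.7]]
radar = [[0.8, 0.5, 0.3],
         [0.5, 0.9, 0.5],
         [0.2, 0.5, 0.6]]

fused = [[fuse_cell(a, b) for a, b in zip(row_a, row_b)]
         for row_a, row_b in zip(lidar, radar)]

# Threshold the fused map to mark cells that are likely occupied.
occupied = [[p > 0.5 for p in row] for row in fused]
```

Cells where both sensors report high occupancy (e.g., 0.9 and 0.8) fuse to a probability above either input, while a cell where one sensor is uninformative (0.5) simply keeps the other sensor's estimate; this reinforcement under agreement is why log-odds fusion is a common baseline for combining LiDAR and radar evidence.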
Papers
Identification of Threat Regions From a Dynamic Occupancy Grid Map for Situation-Aware Environment Perception
Matti Henning, Jan Strohbeck, Michael Buchholz, Klaus Dietmayer
Situation-Aware Environment Perception for Decentralized Automation Architectures
Matti Henning, Michael Buchholz, Klaus Dietmayer