Collective Perception
Collective perception research focuses on enabling multiple agents (robots, vehicles, or humans) to collaboratively build a shared understanding of their environment, overcoming the limitations of any single sensor or viewpoint. Current work emphasizes efficient fusion algorithms, such as multi-resolution voxel grid fusion for LiDAR data and agent-based learning frameworks, that integrate heterogeneous sensor information to improve accuracy while keeping communication and computational costs low. The field is central to autonomous driving, multi-robot systems, and the study of human social perception, with applications ranging from safer autonomous vehicles to more robust and efficient robotic swarms.
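To make the voxel-grid fusion idea concrete, below is a minimal sketch of fusing LiDAR point clouds from several agents into a shared occupancy grid at multiple resolutions. It is an illustration only, not the method of any particular paper: the function names (transform_to_world, voxelize, fuse_point_clouds), the choice of NumPy, and the example voxel sizes are all assumptions; real systems would additionally handle timestamps, uncertainty, and bandwidth constraints.

```python
# Minimal sketch: multi-agent LiDAR fusion into a shared multi-resolution voxel grid.
# All names and parameters here are illustrative assumptions, not a published API.
import numpy as np


def transform_to_world(points, pose):
    """Apply a 4x4 homogeneous pose to an (N, 3) point cloud in the agent frame."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homog @ pose.T)[:, :3]


def voxelize(points, voxel_size):
    """Map world-frame points to unique integer voxel indices at one resolution."""
    return np.unique(np.floor(points / voxel_size).astype(np.int64), axis=0)


def fuse_point_clouds(clouds_and_poses, voxel_sizes=(0.2, 0.8)):
    """Fuse per-agent clouds into occupied-voxel sets at several resolutions.

    clouds_and_poses: list of (points (N, 3), pose (4, 4)) pairs, one per agent.
    Returns a dict mapping voxel_size -> set of occupied voxel index tuples.
    """
    world_points = np.vstack([
        transform_to_world(pts, pose) for pts, pose in clouds_and_poses
    ])
    return {
        size: {tuple(v) for v in voxelize(world_points, size)}
        for size in voxel_sizes
    }


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two agents observing overlapping regions, with slightly different poses.
    cloud_a = rng.uniform(0, 10, size=(500, 3))
    cloud_b = rng.uniform(5, 15, size=(500, 3))
    pose_a = np.eye(4)
    pose_b = np.eye(4)
    pose_b[:3, 3] = [1.0, 0.0, 0.0]  # agent B is offset by 1 m along x
    fused = fuse_point_clouds([(cloud_a, pose_a), (cloud_b, pose_b)])
    for size, occupied in fused.items():
        print(f"voxel size {size} m: {len(occupied)} occupied voxels")
```

The coarse grid gives a compact map suitable for sharing over a constrained network, while the fine grid preserves local detail for the agent that produced it; this resolution trade-off is the basic motivation behind multi-resolution fusion schemes.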
Papers
16 papers, published between July 14, 2022 and October 28, 2024.