Paper ID: 2409.09899
Semantic2D: A Semantic Dataset for 2D Lidar Semantic Segmentation
Zhanteng Xie, Philip Dames
This paper presents a 2D lidar semantic segmentation dataset to enhance semantic scene understanding for mobile robots across indoor applications. While most existing lidar semantic datasets focus on 3D lidar sensors and autonomous driving scenarios, the proposed dataset is the first public semantic dataset for 2D lidar sensors and mobile robots. It contains data collected in six different indoor environments and covers nine categories of objects commonly found indoors. A novel semi-automatic labeling framework is proposed to provide point-wise annotations for the dataset with minimal human effort. Based on this 2D lidar dataset, a hardware-friendly stochastic semantic segmentation benchmark is proposed to equip 2D lidar sensors with semantic scene understanding capabilities. A series of segmentation experiments demonstrates that the proposed learning-based benchmark achieves more accurate and richer per-point segmentation than traditional geometry-based extraction algorithms.
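To make the task concrete, the sketch below illustrates what point-wise semantic segmentation of a single 2D lidar scan entails: every range reading in the scan is assigned one of nine semantic class labels. The scan length, the toy 1D-convolutional architecture, and the class names are illustrative assumptions only and are not the authors' benchmark model.

```python
# Minimal sketch (not the paper's model) of per-point semantic segmentation
# of a 2D lidar scan: N range readings in -> N class labels out.
import torch
import torch.nn as nn

NUM_CLASSES = 9      # the dataset defines nine indoor object categories
SCAN_POINTS = 1081   # hypothetical number of beams per 2D lidar scan

class PointwiseScanSegmenter(nn.Module):
    """Toy 1D-CNN mapping a scan of ranges to per-point class logits."""
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, num_classes, kernel_size=1),
        )

    def forward(self, ranges: torch.Tensor) -> torch.Tensor:
        # ranges: (batch, num_points) -> logits: (batch, num_classes, num_points)
        return self.net(ranges.unsqueeze(1))

if __name__ == "__main__":
    scan = torch.rand(1, SCAN_POINTS) * 10.0   # fake range readings in meters
    logits = PointwiseScanSegmenter()(scan)
    labels = logits.argmax(dim=1)              # one class label per lidar point
    print(labels.shape)                        # torch.Size([1, 1081])
```

This contrasts with geometry-based extraction, which typically groups scan points into line or blob segments rather than assigning a semantic class to every individual point.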
Submitted: Sep 15, 2024