Paper ID: 2406.16713
ShanghaiTech Mapping Robot is All You Need: Robot System for Collecting Universal Ground Vehicle Datasets
Bowen Xu, Xiting Zhao, Delin Feng, Yuanyuan Yang, Sören Schwertfeger
This paper presents the ShanghaiTech Mapping Robot, a state-of-the-art unmanned ground vehicle (UGV) designed for collecting comprehensive multi-sensor datasets to support research in robotics, Simultaneous Localization and Mapping (SLAM), computer vision, and autonomous driving. The robot is equipped with a wide array of sensors including RGB cameras, RGB-D cameras, event-based cameras, IR cameras, LiDARs, mmWave radars, IMUs, ultrasonic range finders, and a GNSS RTK receiver. The sensor suite is integrated onto a specially designed mechanical structure with a centralized power system and a synchronization mechanism to ensure spatial and temporal alignment of the sensor data. A 16-node on-board computing cluster handles sensor control, data collection, and storage. We describe the hardware and software architecture of the robot in detail, discuss the calibration procedures for the various sensors, and investigate interference between the LiDAR and RGB-D sensors. The capabilities of the platform are demonstrated through an extensive outdoor dataset collected in a diverse campus environment. Experiments with two LiDAR-based and two RGB-based SLAM approaches showcase the potential of the dataset to support algorithm development and benchmarking in robotics. To facilitate research, we make the dataset publicly available along with the associated robot sensor calibration data: this https URL
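Since the abstract highlights spatially and temporally aligned multi-sensor data, the following minimal Python sketch illustrates the general idea of using extrinsic calibration and a clock offset to bring LiDAR points and sensor timestamps into a common camera frame and time base. It is not the authors' pipeline; the transform values, function names, and the 4 ms offset are illustrative assumptions only.

```python
# Minimal sketch (not the authors' pipeline): applying a hypothetical extrinsic
# calibration and clock offset to align LiDAR data with a camera reference.
import numpy as np


def lidar_to_camera(points_lidar: np.ndarray, T_cam_lidar: np.ndarray) -> np.ndarray:
    """Transform Nx3 LiDAR points into the camera frame via a 4x4 extrinsic matrix."""
    homo = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])  # Nx4 homogeneous
    return (T_cam_lidar @ homo.T).T[:, :3]


def align_timestamp(sensor_stamp: float, clock_offset: float) -> float:
    """Map a sensor timestamp onto the common (synchronized) time base."""
    return sensor_stamp - clock_offset


# Example usage with placeholder calibration values (purely hypothetical).
T_cam_lidar = np.eye(4)                  # identity rotation stands in for a real calibration
T_cam_lidar[:3, 3] = [0.10, 0.00, -0.05]  # hypothetical lever-arm translation in meters
pts = np.array([[5.0, 1.0, 0.2], [10.0, -2.0, 0.5]])
print(lidar_to_camera(pts, T_cam_lidar))
print(align_timestamp(1718000000.123, 0.004))  # hypothetical 4 ms clock offset
```

In practice such transforms and offsets would come from the calibration data and hardware synchronization mechanism described in the paper; the sketch only shows how they are typically consumed downstream.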
Submitted: Jun 24, 2024