| Webpage | Full Paper |
TL;DR: A comprehensive multi-LiDAR, multi-scenario dataset that extensively incorporates segments of real-world geometric degeneracy.
- (20241025) The metadata.json file for the OS1-64 used in the Beta platform can be accessed via the following link: Google Drive.
- (20240910) Data can be downloaded from GEODE - Google Drive.
- (20240910) The dataset README is available.
- Review the overview of the GEODE dataset, including details about sensors, definitions of ROS topics and messages, and important caveats regarding localization evaluation.
- Download the dataset from GEODE - Google Drive. Additional information about each sequence and scenario is available on our homepage. Additional download options for users in Mainland China will be made available in the future. Currently, we only provide ROS1 bags; ROS2 users can convert them with the rosbags-convert toolkit (see the example command after this list).
- Adapt your SLAM algorithm using the provided dataset parameters, and calculate the error after obtaining the results.
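If you need ROS2 bags, a typical conversion with the rosbags toolkit looks like the command below. The exact flags depend on the installed rosbags version (older releases accept the bag path directly); check `rosbags-convert --help` for your installation.

rosbags-convert --src <sequence>.bag --dst <output_dir>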
- Device $\alpha$:
  - Velodyne VLP-16;
  - Stereo HikRobot MV-CS050-10GC cameras;
  - Xsens MTi-30 IMU.
- Device $\beta$:
  - Ouster OS1-64;
  - Stereo HikRobot MV-CS050-10GC cameras;
  - Xsens MTi-30 IMU.
- Device $\gamma$:
  - Livox AVIA;
  - Stereo HikRobot MV-CS050-10GC cameras;
  - Xsens MTi-30 IMU.
The GEODE dataset provides raw sensor data and the corresponding rosbags.
Sensor raw data with * is only available for
The calibration results are stored in three files, `alpha.yaml`, `beta.yaml`, and `gamma.yaml`, according to the acquisition device.
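The exact field layout inside these YAML files depends on the device; as a rough illustration, the sketch below reads one extrinsic block into a 4x4 homogeneous transform. The key names (`extrinsic_lidar_to_imu`, `rotation`, `translation`) are placeholders for illustration, not the actual entries in `alpha.yaml`/`beta.yaml`/`gamma.yaml`.

```python
# Sketch: read one extrinsic from a calibration YAML into a 4x4 matrix.
# NOTE: the key names below are assumed for illustration; check the actual
# structure of alpha.yaml / beta.yaml / gamma.yaml before use.
import numpy as np
import yaml

def load_extrinsic(path, key="extrinsic_lidar_to_imu"):
    """Return a 4x4 homogeneous transform stored under `key`."""
    with open(path, "r") as f:
        calib = yaml.safe_load(f)
    entry = calib[key]                                 # assumed layout
    T = np.eye(4)
    T[:3, :3] = np.asarray(entry["rotation"]).reshape(3, 3)
    T[:3, 3] = np.asarray(entry["translation"]).reshape(3)
    return T

if __name__ == "__main__":
    print(load_extrinsic("alpha.yaml"))
```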
We provide the script `rmse.py` for calculating the localization accuracy of the algorithms you run.
python3 rmse.py <Your traj> <GT traj> <time offset>
For example, the following command calculates the error between every TUM-format trajectory whose name contains `relead` and the ground-truth trajectory, and then reports the average value.
python3 rmse.py relead <GT traj> 0
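For reference, this kind of evaluation boils down to associating estimated and ground-truth poses by timestamp and taking the root mean square of the translational residuals. The sketch below is a simplified stand-in rather than the logic of `rmse.py` itself: it assumes TUM-format files (`timestamp x y z qx qy qz qw`), associates poses by nearest timestamp, and omits the trajectory alignment step a full evaluation would normally include.

```python
# Simplified ATE-RMSE sketch (not the actual rmse.py): TUM-format inputs,
# nearest-timestamp association, no SE(3) trajectory alignment.
import sys
import numpy as np

def load_tum(path):
    """Load a TUM trajectory: rows of 'timestamp x y z qx qy qz qw'."""
    data = np.loadtxt(path)
    return data[:, 0], data[:, 1:4]                    # timestamps, positions

def rmse(est_file, gt_file, time_offset=0.0, max_dt=0.02):
    t_est, p_est = load_tum(est_file)
    t_gt, p_gt = load_tum(gt_file)
    t_est = t_est + time_offset                        # apply the time offset
    errors = []
    for t, p in zip(t_est, p_est):
        i = int(np.argmin(np.abs(t_gt - t)))           # nearest GT stamp
        if abs(t_gt[i] - t) <= max_dt:
            errors.append(np.linalg.norm(p - p_gt[i]))
    return float(np.sqrt(np.mean(np.square(errors))))

if __name__ == "__main__":
    print(rmse(sys.argv[1], sys.argv[2], float(sys.argv[3])))
```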
For the sequences 'off-road', 'inland_waterways', and 'Metro_Tunnels_Tunneling', the three devices were mounted on a rack built from aluminum profiles and collected data simultaneously, while only one set of GT pose equipment captured the motion trajectory. Consequently, the trajectories produced by the algorithm need to be processed before the error calculation. Thanks to the effectiveness of the time synchronization scheme, only the spatial offsets between the different sensors need to be accounted for in these sequences.
For the 'off-road' and 'inland_waterways' sequences, where GT poses are collected with GNSS/INS, we align the GT poses to the coordinate system of the beta device. The trajectories estimated from the beta device's data can therefore be used directly for error calculation. For the alpha and gamma devices, the estimated trajectories are first transformed into the GT pose coordinate system using the scripts `alpha2gt_gnss.py` and `gamma2gt_gnss.py` before the errors are computed; a rough sketch of this kind of frame change is given below.
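The conversion scripts are provided with the dataset; the snippet below is only a conceptual sketch of such a frame change, assuming a known 4x4 extrinsic from the alpha device to the GT (beta) frame. The extrinsic used here is a placeholder, not the calibrated value, and the actual scripts may implement the transformation differently.

```python
# Conceptual frame change (not the dataset's conversion script): re-express an
# alpha-frame TUM trajectory in the GT (beta) frame by conjugating every pose
# with a fixed extrinsic. T_beta_alpha below is a PLACEHOLDER value.
import numpy as np
from scipy.spatial.transform import Rotation as R

T_beta_alpha = np.eye(4)
T_beta_alpha[:3, 3] = [0.10, 0.0, 0.05]                # hypothetical offset [m]

def tum_to_matrix(row):
    T = np.eye(4)
    T[:3, :3] = R.from_quat(row[4:8]).as_matrix()      # qx qy qz qw
    T[:3, 3] = row[1:4]
    return T

def matrix_to_tum(t, T):
    return [t, *T[:3, 3], *R.from_matrix(T[:3, :3]).as_quat()]

def convert(in_path, out_path, T_ext=T_beta_alpha):
    rows = np.atleast_2d(np.loadtxt(in_path))
    T_inv = np.linalg.inv(T_ext)
    out = [matrix_to_tum(r[0], T_ext @ tum_to_matrix(r) @ T_inv) for r in rows]
    np.savetxt(out_path, np.asarray(out), fmt="%.9f")
```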
In the 'Metro_Tunnels_Tunneling' sequence, where GT poses are obtained by tracking prisms with a Leica MS60, we align the ground truth to the alpha device. For the beta and gamma devices, the algorithm's trajectories are converted to the GT pose coordinate system using the scripts `beta2gt_leica.py` and `gamma2gt_leica.py`. These coordinate transformations, and the extra steps they add to the error calculation, are necessary because the GT poses from the Leica prism tracking contain only positions, not attitudes, which makes it difficult to transfer the ground truth to the other two devices using the multi-LiDAR calibration results alone. To keep the processing uniform, we adopt the same approach for all sequences recorded simultaneously; a sketch of the position-only correction follows.
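Since the Leica ground truth contains positions only, a device trajectory is typically compared by projecting each estimated pose to the prism location using the estimated attitude and a fixed lever arm. The sketch below illustrates that idea only; the lever arm value is a placeholder and the provided scripts may proceed differently.

```python
# Conceptual position-only correction (not the dataset's script): project each
# estimated pose to the prism location before comparing against the Leica GT.
# The lever arm below is a PLACEHOLDER, not the calibrated value.
import numpy as np
from scipy.spatial.transform import Rotation as R

LEVER_ARM = np.array([0.0, 0.0, 0.15])                 # hypothetical prism offset [m]

def project_to_prism(tum_rows, lever=LEVER_ARM):
    """Map TUM rows (t x y z qx qy qz qw) to prism positions (t x y z)."""
    out = []
    for row in np.atleast_2d(tum_rows):
        R_wb = R.from_quat(row[4:8]).as_matrix()       # estimated attitude
        out.append([row[0], *(row[1:4] + R_wb @ lever)])
    return np.asarray(out)
```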
For the “stair” sequence, we obtain the ground truth pose by using the PALoc algorithm to align the sensor data with the ground truth map. However, due to the small field of view of the Livox Avia LiDAR equipped with the
Click the button below to access detailed information (including scenarios, degeneration types, etc.) and to download the dataset.
Awesome-Algorithms-Against-Degeneracy
- SubT-MRS: Pushing SLAM Towards All-weather Environments, CVPR, 2024. [Paper] [Website]
- ENWIDE Dataset (related paper: COIN-LIO: Complementary Intensity-Augmented LiDAR Inertial Odometry, ICRA, 2024. [arXiv] [Code])
- LiDAR Degeneracy Datasets (related paper: Degradation Resilient LiDAR-Radar-Inertial Odometry, ICRA, 2024. [arXiv])
- WHU-Helmet: A helmet-based multi-sensor SLAM dataset for the evaluation of real-time 3D mapping in large-scale GNSS-denied environments, IEEE Transactions on Geoscience and Remote Sensing, 2023. [Paper] [Website]
- Open-source datasets released by the SubT teams
- Heterogeneous LiDAR Dataset for Benchmarking Robust Localization in Diverse Degenerate Scenarios, Zhiqiang Chen, Yuhua Qi, Dapeng Feng, Xuebin Zhuang, Hongbo Chen, Xiangcheng Hu, Jin Wu, Kelin Peng, Peng Lu. Under review. [arXiv]
If you have any other issues, please report them on the repository.