This folder contains the MATLAB processing scripts used to generate the datasets in the 3DFeat-Net paper, organized into their respective subfolders:
- Oxford Robotcar
- KITTI
- ETH (only results are provided, see the ETH section below for details)
Details of the data format can be found in the last section.
The folder "oxford" contains the scripts for generating the training data for the Oxford Robotcar dataset used in 3DFeat-Net. The test data requires manual work, so we provide it as a direct download (see here).
You'll first need to register for an account on the Oxford Robotcar dataset website to download the raw Oxford dataset manually.
For our work, we use 35 training + 5 test trajectories; the remaining trajectories were filtered out for various reasons, e.g. bad GPS or poor-quality LiDAR scans due to rain. The trajectories used are listed in datasets_train.txt and datasets_test.txt respectively.
For each of the datasets, download (1) the LMS Front data, and (2) the GPS data. Unzip all the downloaded files into the same directory, so that each trajectory folder contains the gps and lms_front subfolders.
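As a quick sanity check, the following MATLAB sketch lists any trajectory folder that is missing either subfolder (RAW_FOLDER is a hypothetical example path, not something the scripts require):

```matlab
% Minimal sanity check: every unzipped trajectory folder should contain
% both gps/ and lms_front/. RAW_FOLDER is a hypothetical example path.
RAW_FOLDER = 'D:/datasets/oxford_raw';
trajs = dir(RAW_FOLDER);
trajs = trajs([trajs.isdir] & ~ismember({trajs.name}, {'.', '..'}));
for i = 1:numel(trajs)
    t = fullfile(RAW_FOLDER, trajs(i).name);
    if ~isfolder(fullfile(t, 'gps')) || ~isfolder(fullfile(t, 'lms_front'))
        fprintf('Missing gps/ or lms_front/ in %s\n', t);
    end
end
```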
Generating the training data comprises running two scripts:
- oxford_build_pointclouds.m: This script accumulates the line scans into 3D point clouds. Before running, set the following two variables at the top of the script (see the example after this list):
  - FOLDER: points to the folder containing the raw data, i.e. the data unzipped in the previous section.
  - DST_FOLDER: destination folder to store the generated point clouds.
- oxford_generate_test_cases.m: This generates the positives and non-negatives for the training triplets (we do not store the negatives since there are too many). As before, set DST_FOLDER to the same directory as the previous step.
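For reference, the two variables at the top of oxford_build_pointclouds.m would be edited to something like the following (both paths are hypothetical examples):

```matlab
% At the top of oxford_build_pointclouds.m -- example values only.
FOLDER     = 'D:/datasets/oxford_raw';        % raw data with gps/ and lms_front/ subfolders
DST_FOLDER = 'D:/datasets/oxford_processed';  % output folder for generated point clouds and train.txt
```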
After running the scripts, DST_FOLDER should contain the generated point clouds. DST_FOLDER/train.txt will be in the following format:
[bin-file] | ...
We provide the test data as a direct download, which has been pairwise registered (using ICP) and manually cleaned up.
- Descriptor matching: The 30,000 cluster pairs for evaluating descriptor matching (Fig. 3) can be downloaded from here. The clusters have been cropped to a 4.0m radius, but note that the results in the paper consider a 2m radius for all descriptors (a short re-cropping sketch follows this list).
- Detection+Feature Description, and Registration: The test models can be downloaded from here. They are generated using oxford_build_pointclouds.m above, but have been randomly rotated and downsampled to evaluate rotational equivariance and robustness to sparse point clouds.
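If you wish to reproduce the 2m evaluation radius, a minimal MATLAB sketch is shown below; it assumes each cluster is an Nx3 matrix of points centred on its keypoint at the origin (this layout is an assumption, not necessarily the released format):

```matlab
% Hypothetical re-cropping of a cluster from 4.0 m to 2 m radius, assuming
% `cluster` is an Nx3 matrix of points centred on its keypoint at the origin.
radius    = 2.0;
dists     = sqrt(sum(cluster.^2, 2));       % distance of each point from the keypoint
cluster2m = cluster(dists <= radius, :);    % keep only points within 2 m
```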
The folder "kitti" contains the scripts to process the KITTI dataset for evaluation.
Download the odometry dataset from the KITTI website. You'll need to download (1) velodyne laser data, (2) ground truth poses, and (3) calibration files. Unzip the files into the same folder, and you'll end up with 2 subfolders:
- poses (containing 00.txt, 01.txt, ..., 10.txt)
- sequences (containing 21 subfolders, each with the velodyne data in a "velodyne" folder and a calib.txt file)
Open process_kitti_data.m, set KITTI_FOLDER to point to the folder from the previous step, and run the script. The processed data will be stored in OUTPUT_FOLDER.
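For reference, each raw KITTI velodyne scan is a binary file of float32 values in (x, y, z, reflectance) order; a minimal MATLAB read sketch (the sequence and scan names below are hypothetical examples) looks like this:

```matlab
% Read one raw KITTI velodyne scan (Nx4 float32: x, y, z, reflectance).
scanFile = fullfile(KITTI_FOLDER, 'sequences', '00', 'velodyne', '000000.bin');
fid = fopen(scanFile, 'r');
pts = fread(fid, [4 Inf], 'single')';   % N x 4 matrix
fclose(fid);
xyz = pts(:, 1:3);                      % keep the 3D coordinates
```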
For this dataset, we did not do much preprocessing of the raw data (other than voxel-grid filtering). We instead provide a copy of our computed keypoints and descriptors, which can be downloaded from here.
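The voxel-grid filtering step can be reproduced with something like the following sketch; this uses MATLAB's Computer Vision Toolbox, and the 0.1 m grid size is an assumed value rather than the exact setting we used:

```matlab
% Hypothetical voxel-grid filtering of a raw scan, where xyz is an Nx3 matrix
% of points. Requires the Computer Vision Toolbox; the grid size is assumed.
ptCloud  = pointCloud(xyz);
filtered = pcdownsample(ptCloud, 'gridAverage', 0.1);   % 0.1 m voxel grid
xyzFilt  = filtered.Location;
```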
The computed descriptors are stored in two folders according to their dataset: 1) gazebo_winter, and 2) wood_autumn. Each .bin file contains the keypoints+descriptors for the corresponding point cloud. Note, however, that Hokuyo_-1.bin contains the keypoints+descriptors for the global point cloud constructed from the individual point clouds of the other season, i.e. wood_autumn\Hokuyo_-1.bin contains the results for the Wood summer point clouds.
Each .bin file is stored in a binary format containing single-precision floats:
- Each bin file in the processed directory is a binary file containing Nx6 float32 values for the N points in the point cloud (a minimal MATLAB read sketch follows this list): $(x_1, y_1, z_1, Nx_1, Ny_1, Nz_1), (x_2, y_2, z_2, Nx_2, Ny_2, Nz_2), \ldots$
- groundtruths.txt contains the transformation between each of the local point clouds and the global point cloud. See the MATLAB script [ROOT]/scripts/show_alignment.m to understand how to interpret the transformation.
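As referenced above, a minimal MATLAB sketch for reading one of these Nx6 point cloud files could look like this (the filename is a hypothetical example):

```matlab
% Read a processed point cloud .bin (Nx6 float32: x, y, z, Nx, Ny, Nz).
fid  = fopen('example_cloud.bin', 'r');    % hypothetical filename
data = fread(fid, [6 Inf], 'single')';     % N x 6 matrix
fclose(fid);
xyz     = data(:, 1:3);                    % point coordinates
normals = data(:, 4:6);                    % point normals
```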