This repository contains the authors' PyTorch implementation of the paper:
Relation-Shape Convolutional Neural Network for Point Cloud Analysis [arXiv] [CVF]
Yongcheng Liu, Bin Fan, Shiming Xiang and Chunhong Pan
CVPR 2019 Oral & Best Paper Finalist
Project Page: https://yochengliu.github.io/Relation-Shape-CNN/
If our paper is helpful for your research, please consider citing:
@inproceedings{liu2019rscnn,
  author    = {Yongcheng Liu and Bin Fan and Shiming Xiang and Chunhong Pan},
  title     = {Relation-Shape Convolutional Neural Network for Point Cloud Analysis},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  pages     = {8895--8904},
  year      = {2019}
}
- Ubuntu 14.04
- Python 3 (recommend Anaconda3)
- PyTorch 0.3.*/0.4.*
- CMake > 2.8
- CUDA 8.0 + cuDNN 5.1
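If you use Anaconda, a minimal environment setup might look like the sketch below. The environment name and exact package specs are assumptions; adjust them to your system and the requirements above.

```sh
# Hypothetical setup matching the requirements above; versions may need adjusting.
conda create -n rscnn python=3.6
source activate rscnn
conda install pytorch=0.4.1 cuda80 -c pytorch   # PyTorch 0.4.* built against CUDA 8.0
conda install cmake                             # the build step below needs CMake > 2.8
```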
git clone https://github.com/Yochengliu/Relation-Shape-CNN.git
cd Relation-Shape-CNN
mkdir build && cd build
cmake .. && make
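Before building, it can help to confirm that PyTorch sees the expected CUDA and cuDNN versions. This one-liner uses only standard torch attributes:

```sh
# Expect a 0.3.x/0.4.x torch version, CUDA 8.0, a cuDNN 5.1.x build number and True
# if the environment matches the requirements above.
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.backends.cudnn.version(), torch.cuda.is_available())"
```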
Shape Classification
Download and unzip ModelNet40 (415M). Replace $data_root$ in cfgs/config_*_cls.yaml with the dataset parent path.
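For example, if the dataset is unzipped somewhere under /path/to/datasets (an illustrative path), the entry would look roughly like this:

```yaml
# cfgs/config_*_cls.yaml (sketch; only the data_root entry is shown)
# For classification, data_root is the parent directory containing the unzipped ModelNet40 folder.
data_root: /path/to/datasets
```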
ShapeNet Part Segmentation
Download and unzip ShapeNet Part (674M). Replace $data_root$ in cfgs/config_*_partseg.yaml with the dataset path.
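Likewise for part segmentation (the path is illustrative):

```yaml
# cfgs/config_*_partseg.yaml (sketch; only the data_root entry is shown)
# Here data_root points to the unzipped ShapeNet Part dataset itself.
data_root: /path/to/shapenet_part
```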
sh train_cls.sh
You can modify relation_prior in cfgs/config_*_cls.yaml. We have trained a Single-Scale-Neighborhood classification model, stored in the cls folder, whose accuracy is 92.38%.
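A sketch of that relation_prior edit is below; the value 1 is shown only as an illustration, and the valid options are defined in the code and the shipped config:

```yaml
# cfgs/config_*_cls.yaml (illustrative)
relation_prior: 1   # choice of geometric relation prior used by RS-Conv; see the shipped config for valid values
```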
sh train_partseg.sh
We have trained a Multi-Scale-Neighborhood part segmentation model, stored in the seg folder, whose class mIoU and instance mIoU are 84.18% and 85.81%, respectively.
Voting script: voting_evaluate_cls.py
You can use our model cls/model_cls_ssn_iter_16218_acc_0.923825.pth as the checkpoint in config_ssn_cls.yaml; with this voting you should get an accuracy of 92.71% if everything goes right.
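For example, assuming the config exposes the checkpoint path under a checkpoint key (verify against the shipped config):

```yaml
# cfgs/config_ssn_cls.yaml (sketch)
checkpoint: cls/model_cls_ssn_iter_16218_acc_0.923825.pth
```

Then run the script, e.g. python voting_evaluate_cls.py (check the script for any additional arguments it expects).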
Voting script: voting_evaluate_partseg.py
You can use our model seg/model_seg_msn_iter_57585_ins_0.858054_cls_0.841787.pth as the checkpoint in config_msn_partseg.yaml.
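As above, a sketch assuming the same checkpoint key:

```yaml
# cfgs/config_msn_partseg.yaml (sketch)
checkpoint: seg/model_seg_msn_iter_57585_ins_0.858054_cls_0.841787.pth
```

Then run python voting_evaluate_partseg.py.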
The code is released under the MIT License (see the LICENSE file for details).
The code borrows heavily from Pointnet2_PyTorch.
If you have ideas or questions about our research that you would like to share with us, please contact [email protected].