Commit

init
xzeng committed Jan 23, 2023
1 parent 1118b9b commit 1d24a78
Showing 134 changed files with 18,308 additions and 10 deletions.
14 changes: 14 additions & 0 deletions .gitignore
@@ -0,0 +1,14 @@
__pycache__/
*__pycache__
.idea/
*.pyc
*.m
.ipynb_checkpoints
*swp
*swo
*__pycache__*
models/pvcnn/functional/build/
*.sh
lion_ckpt
data/
datasets/test_data
76 changes: 66 additions & 10 deletions README.md
@@ -1,20 +1,76 @@
## <p align="center">LION: Latent Point Diffusion Models for 3D Shape Generation<br><br> NeurIPS 2022 </p>
<div align="center">
<a href="https://www.cs.utoronto.ca/~xiaohui/" target="_blank">Xiaohui&nbsp;Zeng</a> &emsp; <b>&middot;</b> &emsp;
<a href="http://latentspace.cc/" target="_blank">Arash&nbsp;Vahdat</a> &emsp; <b>&middot;</b> &emsp;
<a href="https://www.fwilliams.info/" target="_blank">Francis&nbsp;Williams</a> &emsp; <b>&middot;</b> &emsp;
<a href="https://zgojcic.github.io/" target="_blank">Zan&nbsp;Gojcic</a> &emsp; <b>&middot;</b> &emsp;
<a href="https://orlitany.github.io/" target="_blank">Or&nbsp;Litany</a> &emsp; <b>&middot;</b> &emsp;
<a href="https://www.cs.utoronto.ca/~fidler/" target="_blank">Sanja&nbsp;Fidler</a> &emsp; <b>&middot;</b> &emsp;
<a href="https://www.cs.utoronto.ca/~xiaohui/" target="_blank">Xiaohui&nbsp;Zeng</a> &emsp;
<a href="http://latentspace.cc/" target="_blank">Arash&nbsp;Vahdat</a> &emsp;
<a href="https://www.fwilliams.info/" target="_blank">Francis&nbsp;Williams</a> &emsp;
<a href="https://zgojcic.github.io/" target="_blank">Zan&nbsp;Gojcic</a> &emsp;
<a href="https://orlitany.github.io/" target="_blank">Or&nbsp;Litany</a> &emsp;
<a href="https://www.cs.utoronto.ca/~fidler/" target="_blank">Sanja&nbsp;Fidler</a> &emsp;
<a href="https://karstenkreis.github.io/" target="_blank">Karsten&nbsp;Kreis</a>
<br> <br>
<a href="https://arxiv.org/abs/2210.06978" target="_blank">Paper</a> &emsp;
<a href="https://nv-tlabs.github.io/LION" target="_blank">Project&nbsp;Page</a>
</div>
<br><br>
<p align="center">:construction: :pick: :hammer_and_wrench: :construction_worker:</p>
<p align="center">Here, we will release code and checkpoints in the near future! Stay tuned!</p>
<br><br>

<p align="center">
<img width="750" alt="Animation" src="assets/animation.gif"/>
</p>
## Install
* Dependencies:
* CUDA 11.6

* Set up the environment
Install from the conda environment file
```
conda env create --name lion_env --file=env.yaml
conda activate lion_env
# Install some other packages
pip install git+https://github.com/openai/CLIP.git
# build some packages first (optional)
python build_pkg.py
```
Tested with conda version 22.9.0
## Demo
Run `python demo.py`; it loads the released text2shape model from Hugging Face and generates a chair point cloud.
## Released checkpoint and samples
* will be released soon
* put the downloaded file under `./lion_ckpt/`
## Training
### data
* ShapeNet can be downloaded [here](https://github.com/stevenygd/PointFlow#dataset).
* Put the downloaded data as `./data/ShapeNetCore.v2.PC15k` *or* edit the `pointflow` entry in `./datasets/data_path.py` for the ShapeNet dataset path.
### train VAE
* run `bash ./script/train_vae.sh $NGPU` (the released checkpoint is trained with `NGPU=4`)
### train diffusion prior
* requires the VAE checkpoint
* run `bash ./script/train_prior.sh $NGPU` (the released checkpoint is trained with `NGPU=8` on 2 nodes)
### evaluate a trained prior
* download the test data from [here](https://drive.google.com/file/d/1uEp0o6UpRqfYwvRXQGZ5ZgT1IYBQvUSV/view?usp=share_link), unzip and put it as `./datasets/test_data/`
* download the released checkpoint from above
```
checkpoint="./lion_ckpt/unconditional/airplane/checkpoints/model.pt"
bash ./script/eval.sh $checkpoint # will take 1-2 hours
```
## Evaluate the samples with the 1-NNA metrics
* download the test data from [here](https://drive.google.com/file/d/1uEp0o6UpRqfYwvRXQGZ5ZgT1IYBQvUSV/view?usp=share_link), unzip and put it as `./datasets/test_data/`
* run `python ./script/compute_score.py`
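For reference, 1-NNA (1-nearest-neighbor accuracy) classifies every sample in the union of the generated and reference sets by the label of its nearest neighbor; a score near 50% means the two sets are indistinguishable, while a score near 100% means they are easy to tell apart. The sketch below is a from-scratch illustration on generic feature vectors with Euclidean distance, not the repository's `compute_score.py` implementation (which compares point clouds with distances such as Chamfer/EMD):

```python
# Minimal 1-NNA sketch: leave-one-out 1-NN classification over the union
# of two sample sets.  Illustration only, using Euclidean distance on
# flat feature vectors.
import numpy as np

def one_nna(gen: np.ndarray, ref: np.ndarray) -> float:
    """1-NNA between two sample sets, each of shape (n, d)."""
    x = np.concatenate([gen, ref], axis=0)
    labels = np.concatenate([np.zeros(len(gen)), np.ones(len(ref))])
    # pairwise squared Euclidean distances over the union
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)  # exclude self-matches (leave-one-out)
    nearest = d2.argmin(axis=1)
    # fraction of samples whose nearest neighbor shares their label
    return float((labels[nearest] == labels).mean())
```

Two well-separated sets score 1.0; heavily overlapping sets drift toward 0.5.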
## Citation
```
@inproceedings{zeng2022lion,
title={LION: Latent Point Diffusion Models for 3D Shape Generation},
author={Xiaohui Zeng and Arash Vahdat and Francis Williams and Zan Gojcic and Or Litany and Sanja Fidler and Karsten Kreis},
booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
year={2022}
}
```
3 changes: 3 additions & 0 deletions build_pkg.py
@@ -0,0 +1,3 @@
# Importing these modules runs their one-time setup work (e.g. compiling
# the extensions under models/pvcnn/functional/build/), so that later
# training and evaluation jobs do not pay the build cost.
import clip
from models import pvcnn2
from utils import eval_helper
38 changes: 38 additions & 0 deletions datasets/data_path.py
@@ -0,0 +1,38 @@
# Copyright (c) 2022, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
#
# NVIDIA CORPORATION & AFFILIATES and its licensors retain all intellectual property
# and proprietary rights in and to this software, related documentation
# and any modifications thereto. Any use, reproduction, disclosure or
# distribution of this software and related documentation without an express
# license agreement from NVIDIA CORPORATION & AFFILIATES is strictly prohibited.
import os


def get_path(dataname=None):
    dataset_path = {}
    dataset_path['pointflow'] = [
        './data/ShapeNetCore.v2.PC15k/'
    ]

    if dataname is None:
        return dataset_path
    else:
        assert dataname in dataset_path, \
            f'not found {dataname}, only: {list(dataset_path.keys())}'
        for p in dataset_path[dataname]:
            print(f'searching: {dataname}, get: {p}')
            if os.path.exists(p):
                return p
        raise ValueError(
            f'no existing path found for {dataname}, please double check: '
            f'{dataset_path[dataname]}, or edit datasets/data_path.py')


def get_cache_path():
    cache_list = ['/workspace/data_cache_local/data_stat/',
                  '/workspace/data_cache/data_stat/']
    for p in cache_list:
        if os.path.exists(p):
            return p
    raise ValueError(
        f'no existing path found among {cache_list}, '
        'please double check or edit datasets/data_path.py')
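The lookup pattern in this file (return the first candidate path that exists on disk, otherwise fail loudly) can be exercised standalone; a minimal sketch with hypothetical paths, using a temporary directory in place of the real dataset location:

```python
# Standalone sketch of the first-existing-path resolution used above.
# The candidate paths here are placeholders for illustration only.
import os
import tempfile

def resolve_first_existing(candidates):
    """Return the first path in `candidates` that exists, else raise."""
    for p in candidates:
        if os.path.exists(p):
            return p
    raise ValueError(f'no existing path among: {candidates}')

with tempfile.TemporaryDirectory() as tmp:
    candidates = ['/nonexistent/ShapeNetCore.v2.PC15k/', tmp]
    print(resolve_first_existing(candidates))  # prints the temp dir path
```

Raising on a miss (rather than silently returning `None`) surfaces a misconfigured dataset path at startup instead of as a confusing downstream failure.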