This is the official repository for the implementation of Hierarchical Prior Mining for Non-local Multi-View Stereo (Chunlin Ren, Qingshan Xu, Shikun Zhang, Jiaqi Yang; ICCV 2023).
In this work, we propose Hierarchical Prior Mining for Non-local Multi-View Stereo (HPM-MVS). Its key characteristics are the following techniques, which exploit non-local information to assist MVS: 1) A Non-local Extensible Sampling Pattern (NESP), which adaptively changes the size of sampled areas without becoming snared in locally optimal solutions. 2) A new approach that leverages non-local reliable points to construct a planar prior model based on K-Nearest Neighbors (KNN), yielding potential hypotheses for regions where prior construction is challenging. 3) A Hierarchical Prior Mining (HPM) framework, which mines extensive non-local prior information at different scales to assist 3D model recovery. This strategy achieves a good balance between reconstructing details and low-textured areas. Experimental results on the ETH3D and Tanks & Temples benchmarks verify the superior performance and strong generalization capability of our method.
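To illustrate the idea behind the KNN-based planar prior, the sketch below fits a local plane to the k reliable points nearest a query point and reads off a depth hypothesis from that plane. This is a minimal illustration in NumPy, not the paper's implementation; the function name, the choice of k, and the toy data are all assumptions made for the example.

```python
import numpy as np

def knn_planar_prior(reliable_pts, query, k=8):
    """Fit a plane n . x + d = 0 to the k reliable points nearest `query`.

    A minimal sketch of a KNN-based planar prior (not the paper's code):
    returns the unit plane normal `n` and offset `d`.
    """
    # Brute-force k nearest neighbors among the reliable points.
    d2 = np.sum((reliable_pts - query) ** 2, axis=1)
    nbrs = reliable_pts[np.argsort(d2)[:k]]
    centroid = nbrs.mean(axis=0)
    # The singular vector with the smallest singular value of the
    # centered neighbors is the least-squares plane normal.
    _, _, vt = np.linalg.svd(nbrs - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d

# Toy usage: reliable points lying on the plane z = 0.5*x + 1.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(50, 2))
pts = np.column_stack([xy[:, 0], xy[:, 1], 0.5 * xy[:, 0] + 1.0])

n, d = knn_planar_prior(pts, query=np.array([0.0, 0.0, 1.2]))
# Depth hypothesis at the query's (x, y) = (0, 0) from the fitted plane:
z_hat = -(n[0] * 0.0 + n[1] * 0.0 + d) / n[2]
print(round(z_hat, 3))  # ≈ 1.0 for this synthetic plane
```

In HPM-MVS the plane hypotheses drive depth estimation in textureless regions; the sketch only shows the geometric core of fitting a prior plane from non-local reliable points.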
- The initial version of HPM-MVS++ has been released.
The code has been tested on Windows 10 with an RTX 3070 GPU.
- NESP+ACMM, NESP+ACMP, NESP+ACMMP
  - cmake
  - CUDA >= 6.0
  - OpenCV >= 2.4
- HPM-MVS
  - cmake
  - CUDA >= 6.0
  - OpenCV >= 2.4
  - PCL >= 1.7
- Compile
```
mkdir build
cd build
cmake ..
make
```
- Test
  - Use the script `colmap2mvsnet_acm.py` to convert a COLMAP SfM result into MVS input.
  - Run `./xxx $data_folder` to obtain reconstruction results, where `./xxx` is the project's executable name.
If you find our work useful in your research, please consider citing:
```
@InProceedings{Ren_2023_ICCV,
    author    = {Ren, Chunlin and Xu, Qingshan and Zhang, Shikun and Yang, Jiaqi},
    title     = {Hierarchical Prior Mining for Non-local Multi-View Stereo},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {3611-3620}
}
```
This code benefits greatly from the following repositories: ACMH, ACMP, ACMMP. Thanks to their authors for open-sourcing their excellent work.