Efficient Inference and Quantization of CGD for Image Retrieval with OpenVINO
This demo is based on CGD, a PyTorch implementation of the paper Combination of Multiple Global Descriptors for Image Retrieval.
conda create -n CGD python=3.8
conda activate CGD
pip install openvino==2023.0.1 openvino-dev[pytorch,onnx]==2023.0.1 nncf==2.5.0 torch==2.0.1
Prepare the dataset based on Stanford Online Products
sudo mkdir -p /home/data/sop
sudo chmod -R 777 /home/data/sop
python data_utils.py --data_path /home/data
Download the pre-trained PyTorch model ResNet50(SG) trained on the SOP dataset
cp <PATH/TO/DIR>/sop_uncropped_resnet50_SG_1536_0.1_0.5_0.1_128_model.pth results
cp <PATH/TO/DIR>/sop_uncropped_resnet50_SG_1536_0.1_0.5_0.1_128_data_base.pth results
python test.py --query_img_name /home/data/sop/uncropped/281602463529_2.JPG \
--data_base sop_uncropped_resnet50_SG_1536_0.1_0.5_0.1_128_data_base.pth \
--retrieval_num 8
The leftmost query image serves as input to retrieve the 8 most similar images from the database. A green bounding box means the predicted class matches the query image's class, while a red bounding box marks a mismatch. The retrieved images can therefore be further filtered using the predicted class information.
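For reference, the retrieval step itself reduces to a nearest-neighbor search over the stored database. The sketch below is a minimal illustration, assuming the .pth database holds L2-normalized gallery features and class labels under the keys features and labels (these key names are hypothetical; see test.py for the actual format):

import torch

# Hypothetical database layout: gallery descriptors plus class labels.
data_base = torch.load('results/sop_uncropped_resnet50_SG_1536_0.1_0.5_0.1_128_data_base.pth')
gallery_features = data_base['features']  # assumed shape [N, 1536], L2-normalized
gallery_labels = data_base['labels']      # assumed shape [N]

def retrieve(query_feature, retrieval_num=8):
    # For L2-normalized vectors, cosine similarity is a plain dot product.
    query_feature = torch.nn.functional.normalize(query_feature, dim=-1)
    sims = gallery_features @ query_feature
    topk = sims.topk(retrieval_num).indices
    return topk, gallery_labels[topk]

Retrieved images whose label disagrees with the query's predicted class (the red boxes) can then be dropped, which is exactly the class-based filtering described above.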
mkdir -p models
python run_quantize.py
The generated FP32 ONNX model and the FP32/INT8 OpenVINO™ models will be saved in the models directory. In addition, the evaluation results of the OpenVINO™ FP32 and INT8 models are stored as databases in the results directory. These databases can be used directly for image retrieval with an input query image.
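The heavy lifting in run_quantize.py is ONNX export, conversion to OpenVINO™ IR, and NNCF post-training quantization. The sketch below shows the rough shape of that pipeline; the stand-in network, the random calibration data, the file names, and the 224x224 input shape are placeholders rather than the demo's actual settings:

import nncf
import torch
import torchvision
from openvino.tools import mo
from openvino.runtime import serialize

# Stand-in network; run_quantize.py uses the trained CGD model instead.
model = torchvision.models.resnet50(weights=None).eval()

# Export the PyTorch model to ONNX (input shape is an assumption).
torch.onnx.export(model, torch.randn(1, 3, 224, 224), 'models/cgd_fp32.onnx')

# Convert the ONNX model to OpenVINO IR and save the FP32 version.
ov_model = mo.convert_model('models/cgd_fp32.onnx')
serialize(ov_model, 'models/cgd_fp32.xml')

# Calibration data drives the choice of INT8 quantization parameters;
# random tensors stand in here for real SOP images.
calibration_loader = torch.utils.data.DataLoader(
    [torch.randn(3, 224, 224) for _ in range(300)], batch_size=1)

def transform_fn(batch):
    # NNCF feeds each transformed item directly to the model input.
    return batch.numpy()

quantized_model = nncf.quantize(ov_model, nncf.Dataset(calibration_loader, transform_fn))
serialize(quantized_model, 'models/cgd_int8.xml')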
Image retrieval with the OpenVINO™ FP32 model database:
python test.py --query_img_name /home/data/sop/uncropped/281602463529_2.JPG \
--data_base ov_fp32_model_data_base.pth \
--retrieval_num 8
Image retrieval with the OpenVINO™ INT8 model database:
python test.py --query_img_name /home/data/sop/uncropped/281602463529_2.JPG \
--data_base ov_int8_model_data_base.pth \
--retrieval_num 8
The PyTorch and OpenVINO™ FP32 models retrieve the same images. Although the 7th image retrieved by the OpenVINO™ INT8 model does not match the FP32 results, it can be filtered out using the predicted class information.
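To run the quantized model on a new query image outside of test.py, inference through the OpenVINO™ runtime looks roughly like the sketch below. The model file name follows the placeholder used in the earlier quantization sketch, the preprocessing must match test.py's actual resize and normalization, and the output ordering is also an assumption:

import numpy as np
from openvino.runtime import Core

core = Core()
# Load and compile the INT8 model produced by run_quantize.py.
compiled_model = core.compile_model('models/cgd_int8.xml', device_name='CPU')

# Placeholder preprocessed query image in NCHW float32 layout.
query = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference; the first output is assumed to be the 1536-d descriptor.
feature = compiled_model(query)[compiled_model.output(0)]
print(feature.shape)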