This project provides the PyTorch implementation of Shadow Knowledge Distillation: Bridging Offline and Online Knowledge Transfer (NeurIPS 2022).
The open-source code and training logs are under preparation; thank you for your patience!
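For background, the offline side of such knowledge transfer is typically the vanilla distillation loss of Hinton et al.: the student matches the teacher's temperature-softened output distribution. The sketch below is a minimal, dependency-free illustration of that standard KD loss, not the SHAKE method itself; the function and parameter names are our own.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T gives softer distributions.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Vanilla KD loss: KL(teacher || student) on temperature-softened
    distributions, scaled by T^2 to keep gradient magnitudes comparable."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
```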
Training logs and models: https://pan.baidu.com/s/1hXe6iTCFw8nD_heDpCh1ag (shak)
This repo is partly built on the following repos; we thank their authors.
If you find this project helpful for your research, please consider citing the following paper:
@inproceedings{li2022shake,
  author    = {Lujun Li and Jin Zhe},
  title     = {Shadow Knowledge Distillation: Bridging Offline and Online Knowledge Transfer},
  booktitle = {Thirty-sixth Conference on Neural Information Processing Systems (NeurIPS)},
  year      = {2022}
}