
Why not perform the transfer learning setting? #5

Open
ha-lins opened this issue Sep 29, 2021 · 1 comment
ha-lins commented Sep 29, 2021

Hi @mengyings,

Thanks for the great work! I have a minor question, as the title shows. You say: "We do not evaluate the transfer learning setting in this paper, where a pretrained model is applied to another dataset." The transfer learning setting could be more challenging and practical. Following Hu et al. [1], you could pretrain on the unlabeled ZINC15 database and then evaluate on these downstream tasks. Would it be possible to run this setting for MoCL-DK? Thanks!

Reference:
[1] Strategies to pre-train graph neural networks. Hu et al. 2020.
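To make the requested setting concrete, here is a minimal, purely illustrative sketch of the two-stage workflow (pretrain on unlabeled data, then transfer the learned representation to a labeled downstream task). All names and the toy "encoder" are hypothetical stand-ins, not MoCL-DK's or Hu et al.'s actual API; a real run would pretrain a GNN encoder on ZINC15 and fine-tune it per downstream dataset.

```python
# Hypothetical sketch of the pretrain-then-transfer setting: learn a
# representation from unlabeled data, then reuse it for a labeled
# downstream task. Names are illustrative, not MoCL-DK's API.

def pretrain(unlabeled):
    """'Pretraining': learn a feature scale (here, just the mean)
    from unlabeled data and return an encoder that uses it."""
    mean = sum(unlabeled) / len(unlabeled)
    return lambda x: x - mean  # encoder: center inputs by the learned mean

def finetune(encoder, labeled):
    """Fit a simple threshold classifier on encoded labeled data,
    reusing the pretrained encoder unchanged."""
    pos = [encoder(x) for x, y in labeled if y == 1]
    neg = [encoder(x) for x, y in labeled if y == 0]
    threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: 1 if encoder(x) > threshold else 0

# Stage 1: pretrain on "unlabeled" data (stand-in for ZINC15 molecules).
encoder = pretrain([1.0, 2.0, 3.0, 4.0])
# Stage 2: fine-tune on a small labeled downstream task.
classify = finetune(encoder, [(0.5, 0), (1.0, 0), (4.0, 1), (4.5, 1)])
print(classify(4.2))  # → 1
```

The key point of the setting is that the encoder is learned once on the large unlabeled corpus and only the downstream head (here, the threshold) is fit per task.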

mengyings (Collaborator) commented:

Yes, we are currently pretraining on the ZINC dataset and will release the pre-trained model once done.
