
Retro-fitting a pretrained model #26

Open · dean-sh opened this issue May 22, 2022 · 7 comments
@dean-sh commented May 22, 2022

Hey,

Thank you for your implementation!
Is it possible to use your library to "retro-fit" a pretrained model?

I guess it would mean freezing the model during training and only fine-tuning the retrieval and cross-attention components?
How would you recommend doing that?
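For concreteness, here's the rough shape of what I'm imagining (an untested sketch; the `"cross_attn"` substring is just my guess at this library's module naming, so it would need checking against `retro.named_modules()` first):

```python
import torch

def freeze_all_but_cross_attention(retro: torch.nn.Module):
    # Freeze every pretrained weight...
    for p in retro.parameters():
        p.requires_grad = False
    # ...then unfreeze only the (assumed) chunked cross-attention blocks.
    for name, module in retro.named_modules():
        if "cross_attn" in name:  # guessed module name, verify before relying on it
            for p in module.parameters():
                p.requires_grad = True
    return [p for p in retro.parameters() if p.requires_grad]

# Usage, assuming `retro` is an instance of this library's RETRO model:
# optimizer = torch.optim.Adam(freeze_all_but_cross_attention(retro), lr=3e-4)
```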

Thanks!

@bjascob commented Jun 2, 2022

I'm interested in this as well, but I haven't had time to work on it. The original paper "retrofitted" T5 by adding cross-attentions between the pretrained model and the KB retrieval/chunk system. They claimed it only took a small number of training steps to teach the revised model to utilize the new cross-attentions. I'm assuming this involved training all the model weights on a masking task, the same way as in the original pretraining.

It shouldn't be too difficult to hack up the Hugging Face model code to add the cross-attentions and then use the information retrieval components from here. I'll probably try this sometime in the next few months. I'm more interested in the Bart model, so I was planning to work on that, not T5. Let me know if you or someone else gets to it first.
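For anyone who wants a head start, the shape of it might be something like this (untested sketch: the wrapped-layer cadence and the attention hyperparameters are guesses, freezing the base model follows dean-sh's suggestion above rather than anything confirmed in the paper, and feeding neighbours in via a stashed attribute is a hack to avoid changing the HF forward signature):

```python
import torch.nn as nn
from transformers import BartForConditionalGeneration

class RetroFitLayer(nn.Module):
    """Runs a pretrained (frozen) BART decoder layer, then a NEW
    cross-attention over retrieved-neighbour embeddings."""
    def __init__(self, base_layer, d_model, n_heads=8):
        super().__init__()
        self.base_layer = base_layer  # pretrained weights, left frozen
        self.retrieval_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.neighbour_embeds = None  # set externally before each forward pass

    def forward(self, hidden_states, *args, **kwargs):
        outputs = self.base_layer(hidden_states, *args, **kwargs)
        h = outputs[0]
        if self.neighbour_embeds is not None:
            # decoder states attend to encoded retrieved chunks
            attn_out, _ = self.retrieval_attn(h, self.neighbour_embeds, self.neighbour_embeds)
            h = self.norm(h + attn_out)
        return (h,) + outputs[1:]

model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
for p in model.parameters():
    p.requires_grad = False  # freeze the pretrained model

# Wrap every third decoder layer; the cadence is arbitrary here.
for i, layer in enumerate(model.model.decoder.layers):
    if i % 3 == 2:
        model.model.decoder.layers[i] = RetroFitLayer(layer, model.config.d_model)
```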

@bling0830

Thank you for your implementation!

I'm interested in how you would add CCA to Bart: in the encoder or the decoder? If in the encoder, CCA is causal, so how would you recommend resolving that? If in the decoder, retrieval needs at least 64 tokens per chunk, so if the generated text is shorter than 64 tokens, retrieval would never be used.
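To make the 64-token point concrete (toy numbers; 64 is the paper's default chunk size, and retrieval fires once per completed chunk):

```python
chunk_size = 64  # RETRO retrieves neighbours per completed chunk
for seq_len in (50, 64, 200):
    print(f"seq_len={seq_len:3d} -> {seq_len // chunk_size} retrieval chunk(s)")
# seq_len= 50 -> 0 retrieval chunk(s)   <- shorter than one chunk: retrieval never used
# seq_len= 64 -> 1 retrieval chunk(s)
# seq_len=200 -> 3 retrieval chunk(s)
```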

Thanks!

@saisurbehera

Have any of you worked on the retrofitting part yet?

@bjascob commented Nov 18, 2022

I haven't had the time, and although I'm still somewhat interested, realistically I probably won't get to this.

It might be worth emailing the authors of the original paper to see if they'd be willing to post that code or provide additional information on the retrofitting process. As I recall, there was only a paragraph or so on it, and there are a number of details it would be helpful for them to provide.

@saisurbehera

Yup, let me email them and hopefully they respond.

@saisurbehera

I got no response from them.

@ilyalasy

Hey there, anyone had any time to work on this?
