Fine tune arguments to learn new knowledge without forgetting previous #996
Unanswered · SutirthaChakraborty asked this question in Q&A
Replies: 1 comment 1 reply
- Continual Learning might help you
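One simple continual-learning technique the reply may be pointing at is rehearsal (experience replay): mix examples from the original training data into each fine-tuning batch so the model keeps seeing the old distribution while it learns the new words. A minimal sketch, with placeholder data standing in for the real datasets:

```python
# Hypothetical sketch of rehearsal (experience replay): each fine-tuning
# batch is part new data, part a random sample of the old training data.
# The data and ratio here are illustrative, not from the original thread.
import random

old_data = [("old", i) for i in range(100)]  # stands in for pretraining data
new_data = [("new", i) for i in range(10)]   # the few new words to learn

def replay_batches(new_data, old_data, batch_size=8, replay_ratio=0.5, seed=0):
    """Yield batches mixing new examples with replayed old examples."""
    rng = random.Random(seed)
    n_old = int(batch_size * replay_ratio)   # how many old examples per batch
    n_new = batch_size - n_old               # how many new examples per batch
    for start in range(0, len(new_data), n_new):
        batch = new_data[start:start + n_new] + rng.sample(old_data, n_old)
        rng.shuffle(batch)
        yield batch

batch = next(replay_batches(new_data, old_data))
```

The replay ratio trades off plasticity against stability: more replayed old data means less forgetting but slower learning of the new words.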
What fine-tuning arguments should I use to make sure we are only adding knowledge to the existing pretrained model, without affecting its past training, so that it learns a few of my new words?
When I retrain, the model forgets the pretrained vocabulary. I use a small learning rate and weight decay.
Does anyone have recommended argument values for this?
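There is no single "perfect" argument set, but besides a small learning rate, a common approach is to freeze most of the pretrained parameters and fine-tune only the head (or the last few layers) on the new words. A minimal PyTorch sketch with a toy model standing in for the pretrained one (the architecture, layer choices, and hyperparameters are illustrative assumptions, not from this thread):

```python
# Hypothetical sketch: freeze everything except the output head and
# fine-tune it with a small learning rate and weight decay, one common
# way to reduce catastrophic forgetting. The model here is a toy stand-in.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(
    nn.Embedding(100, 16),    # stands in for the pretrained embedding table
    nn.Flatten(start_dim=1),
    nn.Linear(16, 16),
    nn.ReLU(),
    nn.Linear(16, 100),       # output head we adapt to the new words
)

# Freeze all parameters, then unfreeze only the output head.
for p in model.parameters():
    p.requires_grad = False
for p in model[-1].parameters():
    p.requires_grad = True

# Small learning rate and weight decay, as the question already uses.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-5, weight_decay=0.01,
)

frozen_before = model[0].weight.clone()
x = torch.randint(0, 100, (4, 1))
y = torch.randint(0, 100, (4,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```

After the step, only the head's weights change; the frozen embedding (and everything else) stays exactly as pretrained, so the old vocabulary cannot be overwritten.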