Trainable = False for layers/embedding layer #333
Asked by sravanbabuiitm in Q&A
Hi! When training a deep network (e.g. an LSTM) with pretrained embeddings, one has the choice of freezing the pretrained embeddings or letting them be fine-tuned as well. Other than not creating a trainer for the embedding layer's parameters (in which case they won't get updated), is there a different way to explicitly mark a layer as non-trainable when creating it? I would rather avoid having to create a trainer for just the rest of the network.
Answered by szha on Sep 12, 2018
You can do

```python
net.layer1.collect_params().setattr('grad_req', 'null')
```

to freeze the weights in layer1.
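For context, here is a minimal end-to-end sketch of this pattern (assuming the MXNet 1.x Gluon API; the layer sizes, dummy data, and the step that loads pretrained vectors are hypothetical placeholders). Setting `grad_req` to `'null'` means no gradient is computed or stored for those parameters, and a single `Trainer` over `net.collect_params()` skips them, so no separate trainer is needed for the rest of the network:

```python
import mxnet as mx
from mxnet import gluon, nd

# Hypothetical toy model: an embedding layer followed by a dense classifier.
net = gluon.nn.Sequential()
with net.name_scope():
    net.add(gluon.nn.Embedding(input_dim=10000, output_dim=50))
    net.add(gluon.nn.Dense(2))
net.initialize()

# In practice you would copy the pretrained vectors into the embedding
# weight here, e.g. net[0].weight.set_data(pretrained_matrix)  # placeholder

# Freeze the embedding layer: no gradient is computed or applied for it.
net[0].collect_params().setattr('grad_req', 'null')

# One trainer over all parameters; frozen parameters are skipped by step().
trainer = gluon.Trainer(net.collect_params(), 'adam', {'learning_rate': 1e-3})

x = nd.array([[1, 2, 3]])   # a dummy batch of token indices
with mx.autograd.record():
    loss = net(x).sum()     # dummy scalar loss
loss.backward()
trainer.step(batch_size=1)  # updates Dense, leaves Embedding untouched
```

Since `grad_req` is checked at update time, setting it back to `'write'` later should re-enable fine-tuning of the embeddings with the same trainer.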