Same cell for word and sentence level #24
Comments
AFAIR, Cell is/was a template/factory, not a layer or parameter container.
I think I see what you mean. BNLSTMCell is just a class declaration. Still, when BNLSTMCell.call() is invoked, it uses the same set of parameters that were already defined in the graph at the word level, because they fall under the same name scope. So they are treated as one unique set of parameters, not two (word and sentence). I am seeing a big performance difference after making this change (I am trying it on a somewhat more complex multi-class, multi-label problem):

cell_word = BNLSTMCell(40, is_training)  # h-h batchnorm LSTMCell
cell_sent = BNLSTMCell(40, is_training)  # h-h batchnorm LSTMCell

Similarly, if you want different cells for the forward and backward directions, you should define two more cells. Please correct me if I am wrong.

One other small thing: as far as I understand, it is general practice not to apply dropout at eval time, but the code here applies it anyway.
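Below is a minimal sketch of both points, assuming TF 1.x. Standard LSTMCell/DropoutWrapper stand in for this repo's BNLSTMCell (whose (num_units, is_training) signature I am only taking from the snippet above); the keep probability and placeholder are illustrative assumptions, not the project's actual worker.py code.

```python
import tensorflow as tf

is_training = tf.placeholder_with_default(False, shape=(), name="is_training")

# Two distinct cell objects give the word level and the sentence level their
# own LSTM parameters instead of sharing one set.
cell_word = tf.nn.rnn_cell.LSTMCell(40)
cell_sent = tf.nn.rnn_cell.LSTMCell(40)

# Disable dropout at eval time by making keep_prob depend on is_training.
keep_prob = tf.cond(is_training,
                    lambda: tf.constant(0.5),   # assumed training keep prob
                    lambda: tf.constant(1.0))   # no dropout during evaluation
cell_word = tf.nn.rnn_cell.DropoutWrapper(cell_word, output_keep_prob=keep_prob)
cell_sent = tf.nn.rnn_cell.DropoutWrapper(cell_sent, output_keep_prob=keep_prob)

# If separate forward/backward parameters are wanted for a bidirectional RNN,
# define two more cells in the same way.
cell_word_bw = tf.nn.rnn_cell.LSTMCell(40)
cell_sent_bw = tf.nn.rnn_cell.LSTMCell(40)
```

As far as I recall, TF 1.x Layer-based cells reuse their own variables on every call, so one cell object per level (and per direction) is the simplest way to guarantee separate parameter sets.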
You might be right, my memory of TF's conventions is quite vague at this point.
In worker.py, looking at lines 70-80, it seems you are using the same cell for the word level and the sentence level, but these should be different LSTM cells.