Help wanted: how can an unrolled LSTM share BatchNorm's aux_states parameters across the input layer? #3076
Comments
Could you revise your question in English? We still have many users who cannot read Chinese.
@mu Not really; the LSTM BatchNorm problem needs to be fixed after nnvm.
I have the same problem. How can the auxiliary states and weights of a BatchNorm layer be shared?
This issue is closed due to lack of activity in the last 90 days. Feel free to reopen if this is still an active issue. Thanks!
For an unrolled LSTM with sequence length m, unrolling the recurrent part yields m inputs and m outputs, so applying BatchNorm to the inputs creates m BatchNorm nodes.
In GraphExecutor::InitDataEntryInfo, op_nodes are created per graph node, and each op_node allocates its own aux_states according to op->ListAuxiliaryStates(), so m separate sets of aux_states are created.
If we want the m BatchNorm nodes to share parameters the way the LSTM weights are shared, what do we have to do so that the m aux_states refer to the same storage?
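To illustrate what "sharing aux_states" means, here is a minimal NumPy sketch (not MXNet's actual implementation; the function and variable names are hypothetical). Each of the m unrolled timesteps applies batch normalization with the same gamma/beta weights and, crucially, updates one shared set of running statistics (the analogue of BatchNorm's moving_mean/moving_var aux states) instead of m independent copies:

```python
import numpy as np

def batchnorm(x, gamma, beta, aux, momentum=0.9, eps=1e-5):
    """Batch-normalize x over the batch axis and update the shared
    running statistics stored in `aux` (a plain dict here)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Update the *shared* auxiliary states in place: every timestep that
    # receives the same `aux` dict contributes to the same running stats.
    aux["moving_mean"] = momentum * aux["moving_mean"] + (1 - momentum) * mean
    aux["moving_var"] = momentum * aux["moving_var"] + (1 - momentum) * var
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

m, batch, dim = 4, 8, 16            # sequence length, batch size, feature dim
gamma, beta = np.ones(dim), np.zeros(dim)
# One set of aux states for all m timesteps, not m separate sets.
aux = {"moving_mean": np.zeros(dim), "moving_var": np.ones(dim)}

rng = np.random.default_rng(0)
inputs = rng.normal(size=(m, batch, dim))
outputs = [batchnorm(x_t, gamma, beta, aux) for x_t in inputs]
```

In MXNet's symbolic API the corresponding workaround (under the assumption that the usual BatchNorm inputs are passed explicitly) would be to create the gamma/beta/moving_mean/moving_var Variables once and pass the same symbols to every timestep's BatchNorm call, so that name-based binding maps all m nodes onto one set of arrays.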