This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
It seems mxnet supports implicit variable reuse, as in the example rnn/lstm.py.
But when I tried to initialize a variable with an NDArray, mxnet gives an error:
Traceback (most recent call last):
  File "auto_sum_lstm.py", line 95, in <module>
    initializer = init)
  File "C:\Anaconda3\lib\site-packages\mxnet-0.5.0-py3.5.egg\mxnet\model.py", line 441, in __init__
    self._check_arguments()
  File "C:\Anaconda3\lib\site-packages\mxnet-0.5.0-py3.5.egg\mxnet\model.py", line 469, in _check_arguments
    _check_arguments(self.symbol)
  File "C:\Anaconda3\lib\site-packages\mxnet-0.5.0-py3.5.egg\mxnet\executor_manager.py", line 62, in _check_arguments
    'arguments are %s') % (name, str(arg_names)))
ValueError: Find duplicated argument name "embed_weight", please make the weight name non-duplicated(using name arguments), arguments are ['data', 'embed_weight', 'sent_l0_i2h_weight', 'sent_l0_i2h_bias', 'sent_l0_init_h', 'sent_l0_h2h_weight', 'sent_l0_h2h_bias', 'sent_l0_init_c', 'doc_l0_i2h_weight', 'doc_l0_i2h_bias', 'embed_weight', 'sent_l0_i2h_weight', 'sent_l0_i2h_bias', 'sent_l0_init_h', 'sent_l0_h2h_weight', 'sent_l0_h2h_bias', 'sent_l0_init_c', 'embed_weight', 'sent_l0_i2h_weight', 'sent_l0_i2h_bias', 'sent_l0_init_h', 'sent_l0_h2h_weight', 'sent_l0_h2h_bias', 'sent_l0_init_c', 'doc_l0_init_h', 'doc_l0_h2h_weight', 'doc_l0_h2h_bias', 'doc_l0_init_c', 'dec_l0_i2h_weight', 'dec_l0_i2h_bias', 'dec_l0_init_h', 'dec_l0_h2h_weight', 'dec_l0_h2h_bias', 'dec_l0_init_c', 'cls_weight', 'cls_bias', 'label']
What I want to do is initialize embed_weight with a pre-trained NDArray, where embed_weight is used many times in the model. Is there any way to do that?
There seems to be a workaround of giving the shared weights different names but initializing them with the same value, but the problem is that the variable might be very large and take up a lot of space.
Thanks
It is not advised to create two variables with the same name as inputs to the same graph, because the binding operation sometimes depends on the naming. It is totally fine to use the same name in two different graphs.