[Optimizer][Bug] Gradient is mutated in the Adam optimizer #15759
sxjscience changed the title from "[Optimizer] Gradient is mutated in the Adam optimizer" to "[Optimizer][Bug] Gradient is mutated in the Adam optimizer" (Aug 6, 2019)
kshitij12345 added a commit to kshitij12345/incubator-mxnet that referenced this issue (Aug 6, 2019)
On looking at the other optimizers' updaters, RMSPropAlexUpdate has the same gradient-mutation problem.
Thanks @kshitij12345 !
gyshi pushed a commit to gyshi/incubator-mxnet that referenced this issue (Sep 7, 2019), apache#15768:
* update code to fix apache#15759
* add relevant test
* re-add the removed conditional dispatch
* fix grad mutate for ftrl_update
* add test for ftrl_update
* fix grad mutate for rmspropalex_update
* add test for rmspropalex_update
* use KERNEL_ASSIGN in RMSPropAlexUpdateKernel
* fix grad mutate for rmsprop_update
* add test for rmsprop_update
* add more optimizers for mutation test
* address comments
* refactor code
* retrigger CI (several times)
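The commit items above mention adding mutation tests for several updaters. The snippet below is not the actual test from the PR, just a sketch of the underlying idea, with illustrative shapes and hyperparameters; `assert_grad_not_mutated` is a helper name introduced here.

```python
import mxnet as mx

def assert_grad_not_mutated(update_fn, states, name, **kwargs):
    """Check that an optimizer update op leaves the gradient array untouched."""
    weight = mx.nd.random.uniform(shape=(10,))
    grad = mx.nd.random.uniform(shape=(10,))
    grad_before = grad.copy()
    # wd != 0 so a weight-decay term folded into grad in place would be detected
    update_fn(weight, grad, *states, out=weight, lr=0.01, wd=1e-3, **kwargs)
    assert (grad_before == grad).asnumpy().all(), '%s mutated the gradient' % name

# Adam keeps mean and var state arrays
assert_grad_not_mutated(mx.nd.adam_update,
                        [mx.nd.zeros((10,)), mx.nd.zeros((10,))], 'adam_update')
# RMSProp keeps a running average of squared gradients
assert_grad_not_mutated(mx.nd.rmsprop_update,
                        [mx.nd.zeros((10,))], 'rmsprop_update')
```

On the unpatched code, the adam_update check fails whenever wd is non-zero, which matches the report below.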
In the implementation of Adam, grad, mean, and var are all mutated (see https://github.com/apache/incubator-mxnet/blob/master/src/operator/optimizer_op-inl.h#L1307-L1315). However, the FMutateInputs flag is only set to {2, 3}, when it should be {1, 2, 3} (see https://github.com/apache/incubator-mxnet/blob/master/src/operator/optimizer_op.cc#L699). To reproduce the bug, you can check the value of the gradient before and after the adam_update call (https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/optimizer/optimizer.py#L1226-L1227). We can add the following check into optimizer.py:
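The exact snippet from the issue is not reproduced here; a minimal sketch of such a check, assuming the usual adam_update call inside Adam's update method (`grad_before` is a name introduced for illustration):

```python
# Hypothetical check around the existing adam_update call in
# python/mxnet/optimizer/optimizer.py (the surrounding code may differ by version).
grad_before = grad.copy()                          # snapshot the incoming gradient
adam_update(weight, grad, mean, var, out=weight,   # existing updater call
            lr=lr, wd=wd, **kwargs)
if not (grad_before == grad).asnumpy().all():
    print('adam_update mutated the gradient')
```

Per the report, the comparison fails whenever wd is non-zero, because the kernel appears to fold the weight-decay term into grad in place.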
Create an Adam optimizer with wd=1E-3 and train any model; you will find that the gradient has been mutated.
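Any small model will do for the training-based reproduction; the snippet below is an illustrative Gluon setup (the model, data, and learning rate are placeholders), assuming the before/after check above has been added to optimizer.py:

```python
import mxnet as mx
from mxnet import autograd, gluon

net = gluon.nn.Dense(1)
net.initialize()
# wd=1e-3 is the key setting: with weight decay enabled, the buggy adam_update
# rewrites the gradient arrays in place during trainer.step().
trainer = gluon.Trainer(net.collect_params(), 'adam',
                        {'learning_rate': 1e-3, 'wd': 1e-3})
loss_fn = gluon.loss.L2Loss()

x = mx.nd.random.uniform(shape=(8, 4))
y = mx.nd.random.uniform(shape=(8, 1))
with autograd.record():
    loss = loss_fn(net(x), y)
loss.backward()
trainer.step(8)   # the Adam updater runs here; the check in optimizer.py fires
```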