Operator _backward_FullyConnected is non-differentiable because it didn't register FGradient attribute. #12529
version: mxnet --pre
Hi @PistonY, thanks for submitting the issue.
Is this a bug?
@PistonY currently, MXNet doesn't support second-order differentiation.
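To make the limitation concrete, here is a minimal sketch of a second-order gradient through a layer backed by FullyConnected; the `Dense` layer and input shape are illustrative assumptions, not code from this thread. On the MXNet versions discussed here, the final `backward()` raises the error from the title:

```python
from mxnet import autograd, gluon, nd

# Illustrative layer and input; any layer backed by FullyConnected
# (e.g. gluon.nn.Dense) hits the same limitation
net = gluon.nn.Dense(1)
net.initialize()
x = nd.random.uniform(shape=(4, 3))
x.attach_grad()

with autograd.record():
    y = net(x)
    # create_graph=True records the gradient computation itself so that
    # it can be differentiated a second time
    dx = autograd.grad(y, [x], create_graph=True, retain_graph=True)[0]
    loss = (dx ** 2).sum()

# On MXNet versions without second-order support for FullyConnected this
# raises: "Operator _backward_FullyConnected is non-differentiable ..."
loss.backward()
```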
@nswamy Please change the label from Bug to Feature Request. Thanks.
OK, I hope MXNet supports it soon.
Do you have any updates on this request? I want to train WGAN-GP with Gluon.
Shouldn't my PR fix this issue? Here the second-order gradient for FullyConnected was implemented.
I think we can close this.
I'm trying to implement WGAN-GP in Gluon. The paper includes a "gradient penalty" term, which I wrote out accordingly (see the sketch below), but when I call backward on it the following error is raised: 'Operator _backward_FullyConnected is non-differentiable because it didn't register FGradient attribute.'
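The original snippet was not preserved in this thread, so below is a hypothetical minimal sketch of how such a gradient penalty is commonly written in Gluon; `netD`, `real`, `fake`, `ctx`, and `lambda_gp` are all assumed names, and the inputs are assumed to be 2-D. The `autograd.grad` call with `create_graph=True` is what requires the second-order pass that fails here.

```python
from mxnet import autograd, nd

def gradient_penalty(netD, real, fake, ctx, lambda_gp=10.0):
    """WGAN-GP gradient penalty sketch (hypothetical helper, 2-D inputs)."""
    batch_size = real.shape[0]
    # One random interpolation coefficient per example
    eps = nd.random.uniform(0, 1, shape=(batch_size, 1), ctx=ctx)
    interp = eps * real + (1 - eps) * fake
    interp.attach_grad()
    with autograd.record():
        d_interp = netD(interp)
        # create_graph=True records the gradient computation itself so the
        # penalty can be backpropagated; this is the second-order step that
        # _backward_FullyConnected did not support
        grads = autograd.grad(d_interp, [interp],
                              create_graph=True, retain_graph=True)[0]
        grad_norm = nd.sqrt(nd.sum(grads.reshape((batch_size, -1)) ** 2,
                                   axis=1) + 1e-12)
        penalty = lambda_gp * nd.mean((grad_norm - 1.0) ** 2)
    return penalty
```

Calling `backward()` on the returned penalty reproduces the error above on the affected MXNet versions.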
The complete code is here.
How can I solve it?