
Operator _backward_FullyConnected is non-differentiable because it didn't register FGradient attribute. #12529

Closed
PistonY opened this issue Sep 12, 2018 · 9 comments



PistonY commented Sep 12, 2018

I'm trying to implement WGAN-GP in Gluon. The paper includes a "gradient penalty" term, which I wrote like this:

from mxnet import autograd, nd

def calc_gradient_penalty(netD, real_data, fake_data, LAMBDA, ctx):
    real_data = real_data.as_in_context(ctx)
    b_s = real_data.shape[0]
    # One interpolation coefficient per sample, broadcast across features
    alpha = nd.random.uniform(0, 1, shape=(b_s, 1), ctx=ctx)
    alpha = alpha.broadcast_to(real_data.shape)
    interpolates = alpha * real_data + ((1 - alpha) * fake_data)

    # Copy to detach the interpolates from the generator's graph,
    # then track gradients w.r.t. them
    interpolates = nd.array(interpolates, ctx=ctx)
    interpolates.attach_grad()
    disc_interpolates = netD(interpolates)
    # First-order gradient of D(interpolates) w.r.t. the interpolates;
    # create_graph=True keeps it differentiable for the penalty's backward pass
    gradients = autograd.grad(heads=disc_interpolates, variables=interpolates,
                              head_grads=nd.ones(shape=disc_interpolates.shape, ctx=ctx),
                              create_graph=True, retain_graph=True, train_mode=True)[0]

    gradients = gradients.reshape((gradients.shape[0], -1))
    gradient_penalty = ((gradients.norm(2, axis=1, keepdims=True) - 1) ** 2).mean() * LAMBDA
    return gradient_penalty

But when I run backward with this, an error is raised: 'Operator _backward_FullyConnected is non-differentiable because it didn't register FGradient attribute.'
The complete code is here.
How can I solve it?
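
For context, a rough sketch of the call site I have in mind; trainerD and batch_size are placeholders, not part of the snippet above, and LAMBDA is the penalty weight (10 in the paper):

with autograd.record():
    # Hypothetical WGAN-GP discriminator step: critic scores plus the penalty
    errD_real = netD(real_data).mean()
    errD_fake = netD(fake_data.detach()).mean()
    gp = calc_gradient_penalty(netD, real_data, fake_data, LAMBDA, ctx)
    errD = errD_fake - errD_real + gp
errD.backward()   # this backward pass is what raises the FGradient error
trainerD.step(batch_size)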

PistonY (Author) commented Sep 12, 2018

version: mxnet --pre (nightly build)
os: Ubuntu 16.04
cuda: 9.0

kalyc (Contributor) commented Sep 12, 2018

Hi @PistonY, thanks for submitting the issue.
@mxnet-label-bot[Bug, Operator]

PistonY (Author) commented Sep 13, 2018

Is this a bug?
It has been troubling me for a long time.

zheng-da (Contributor) commented

@PistonY Currently, MXNet doesn't support second-order differentiation.
We might have it in the near future. @sxjscience
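
For anyone who wants to reproduce the limitation without the WGAN-GP machinery, a minimal sketch that fails with the same message on builds from this time:

from mxnet import autograd, nd
from mxnet.gluon import nn

net = nn.Dense(1)
net.initialize()

x = nd.random.uniform(shape=(4, 8))
x.attach_grad()

with autograd.record():
    y = net(x)
    # First-order gradient, kept differentiable for a second pass
    dx = autograd.grad(y, [x], head_grads=nd.ones_like(y),
                       create_graph=True, retain_graph=True)[0]
    loss = (dx ** 2).sum()
loss.backward()   # raises: _backward_FullyConnected is non-differentiable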

apeforest (Contributor) commented

@nswamy Please change the label from Bug to Feature Request. Thanks.

PistonY (Author) commented Sep 14, 2018

OK, I hope MXNet supports it soon.

devymex commented Oct 10, 2019

Are there any updates on this request? I want to train WGAN-GP with Gluon.

larroy (Contributor) commented Oct 10, 2019

Shouldn't my PR fix this issue? It implemented the second-order gradient for FullyConnected:
#14779
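
As a quick sanity check on a build that includes that PR (assuming a recent nightly), a second-order pass through a single Dense layer should now run; a minimal sketch:

from mxnet import autograd, nd
from mxnet.gluon import nn

net = nn.Dense(1)
net.initialize()

x = nd.random.uniform(shape=(4, 8))
x.attach_grad()

with autograd.record():
    y = net(x)
    dx = autograd.grad(y, [x], head_grads=nd.ones_like(y),
                       create_graph=True, retain_graph=True)[0]
    loss = (dx ** 2).sum()
loss.backward()

# x.grad is zeros here (a purely linear layer's input gradient does not
# depend on x), but the call completes instead of raising the FGradient error
print(x.grad)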

larroy (Contributor) commented Oct 10, 2019

I think we can close this.
