Compatibility with PyTorch 2.0; failing test test_gradient_value #1124

Open · h-vetinari opened this issue May 13, 2023 · 4 comments

Comments

@h-vetinari (Contributor)

I'm seeing one test failure in conda-forge/fairscale-feedstock#28 when building against PyTorch 2.0:

=================================== FAILURES ===================================
_____________________________ test_gradient_value ______________________________

    def test_gradient_value():
        """Test that we don't mutate the gradients during backward"""
        model = Linear(2, 2, bias=False)
        optim = AdaScale(SGD(model.parameters(), lr=0.1), num_gradients_to_accumulate=2)
    
        # fwd 1
        out = model(Tensor([0.0, 1.0]))
        out.sum().backward()
        assert np.allclose(model.weight.grad.numpy(), [[0.0, 1.0], [0.0, 1.0]]), model.weight.grad
    
        # fwd 2, grad is accumulated
        out = model(Tensor([0.0, 1.0]))
        out.sum().backward()
        assert np.allclose(model.weight.grad.numpy(), [[0.0, 2.0], [0.0, 2.0]]), model.weight.grad
    
        # assert gain and grad value before/after step/zero_grad
        assert np.allclose(optim.gain(), 1.0000002499999376), optim.gain()
        optim.step()
        assert np.allclose(model.weight.grad.numpy(), [[0.0, 2.0], [0.0, 2.0]]), model.weight.grad
        optim.zero_grad()
>       assert np.allclose(model.weight.grad.numpy(), [[0.0, 0.0], [0.0, 0.0]]), model.weight.grad
E       AttributeError: 'NoneType' object has no attribute 'numpy'

Looks like model.weight.grad becomes None somehow.
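
A likely cause (I haven't confirmed this against the fairscale code): PyTorch 2.0 changed the default of torch.optim.Optimizer.zero_grad() from set_to_none=False to set_to_none=True, so gradients are now freed (set to None) instead of zeroed in place. Assuming AdaScale delegates zero_grad() to the wrapped SGD, the test's final assertion then calls .numpy() on None. A minimal sketch with plain SGD (no fairscale involved) that reproduces the behavior:

    import torch
    from torch.nn import Linear
    from torch.optim import SGD

    model = Linear(2, 2, bias=False)
    optim = SGD(model.parameters(), lr=0.1)

    out = model(torch.tensor([0.0, 1.0]))
    out.sum().backward()
    optim.zero_grad()                   # set_to_none=True by default in 2.0
    print(model.weight.grad)            # None -> .numpy() raises AttributeError

    out = model(torch.tensor([0.0, 1.0]))
    out.sum().backward()
    optim.zero_grad(set_to_none=False)  # restore the pre-2.0 in-place zeroing
    print(model.weight.grad)            # tensor of zeros, as the test expects

If that is indeed the cause, the test could either pass set_to_none=False explicitly or assert that model.weight.grad is None when running under PyTorch >= 2.0.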

@h-vetinari (Contributor, Author)

Gentle ping @min-xu-ai

@min-xu-ai (Contributor)

Thanks for the ping. Sorry, but I am no longer a committer or maintainer for this repo.

@h-vetinari (Contributor, Author)

Sorry to hear it (or perhaps: happy for you? depending on how things turned out...). In any case, who would be your successor then?

@min-xu-ai (Contributor) commented Jun 3, 2023

Unfortunately, I don’t know that information, since I am no longer with FB.
