
Commit be8e11e

update link in note

Co-authored-by: Carlos Mocholí <[email protected]>
2 people authored and Borda committed Jan 13, 2021
1 parent 60d8fe5 commit be8e11e
Showing 1 changed file with 2 additions and 2 deletions.
pytorch_lightning/overrides/data_parallel.py (4 changes: 2 additions & 2 deletions)
@@ -200,8 +200,8 @@ def forward(self, *inputs, **kwargs):
 
 
 # In manual_optimization, we need to call reducer prepare_for_backward.
-# TODO: Keep track of Pytorch DDP and update if there is a change
-# https://github.com/pytorch/pytorch/blob/e6779d4357ae94cc9f9fedb83a87eb6126016769/torch/nn/parallel/distributed.py#L692
+# Note: Keep track of Pytorch DDP and update if there is a change
+# https://github.com/pytorch/pytorch/blob/v1.7.1/torch/nn/parallel/distributed.py#L626-L638
 def prepare_for_backward(model: DistributedDataParallel, output: Any):
     if torch.is_grad_enabled() and model.require_backward_grad_sync:
         model.require_forward_param_sync = True
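The hunk is truncated at this point. For context, here is a minimal sketch of how the rest of prepare_for_backward plausibly continues, mirroring the DDP logic at the link the commit adds (torch/nn/parallel/distributed.py, v1.7.1). The private _find_tensors helper and the reducer and find_unused_parameters attributes are taken from that PyTorch module and may move between versions, so treat this as an illustration rather than the exact committed body:

    from typing import Any

    import torch
    from torch.nn.parallel import DistributedDataParallel
    # Private helper in PyTorch 1.7; it walks a (possibly nested) output
    # structure and yields the tensors it contains.
    from torch.nn.parallel.distributed import _find_tensors


    def prepare_for_backward(model: DistributedDataParallel, output: Any):
        # Replicate what DDP's own forward() does before returning, so that
        # manual optimization can arm the reducer for the upcoming backward.
        if torch.is_grad_enabled() and model.require_backward_grad_sync:
            model.require_forward_param_sync = True
            if model.find_unused_parameters:
                # Hand the reducer the tensors the forward pass produced, so
                # it can mark unused parameters and not wait on their grads.
                model.reducer.prepare_for_backward(list(_find_tensors(output)))
            else:
                model.reducer.prepare_for_backward([])
        else:
            model.require_forward_param_sync = False

In manual optimization this would run after the forward pass and before calling backward() on the loss, so the reducer knows a gradient synchronization is coming.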
