
Fix DDP + SyncBN
Ensure that the model is already on the correct GPU before applying the SyncBN conversion.
BloodAxe authored Apr 5, 2021
1 parent 22a266d commit a168079
Showing 1 changed file with 3 additions and 3 deletions.
Showing 1 changed file: pytorch_lightning/plugins/training_type/ddp.py (3 additions, 3 deletions)
@@ -241,12 +241,12 @@ def init_ddp_connection(self, global_rank: int, world_size: int) -> None:
         torch_distrib.init_process_group(self.torch_distributed_backend, rank=global_rank, world_size=world_size)

     def pre_dispatch(self):
-        if self.sync_batchnorm:
-            self.model = self.configure_sync_batchnorm(self.model)
-
         # move the model to the correct device
         self.model_to_device()

+        if self.sync_batchnorm:
+            self.model = self.configure_sync_batchnorm(self.model)
+
         self.configure_ddp()

         self.barrier()
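The ordering fixed here can be sketched outside of Lightning with plain PyTorch. This is a minimal illustration, not the Lightning implementation: the `device` selection and the toy model are assumptions, and the point is only that the move to the target device happens before `torch.nn.SyncBatchNorm.convert_sync_batchnorm` is applied, mirroring the corrected `pre_dispatch` order above.

```python
import torch
import torch.nn as nn

# A toy model containing a BatchNorm layer (illustrative only).
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())

# 1) Move the model to the correct device first, as the fix requires.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# 2) Only then convert BatchNorm layers to SyncBatchNorm.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)

print(type(model[1]).__name__)  # SyncBatchNorm
```

Note that `convert_sync_batchnorm` replaces every BatchNorm module in place in the module tree; actually running the converted model across ranks additionally requires an initialized process group, which Lightning sets up in `init_ddp_connection`.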
