TorchMetrics, PyTorch Lightning and DataParallel #528

Answered by SkafteNicki
aretor asked this question in CompVision

No, in DDP it should not be necessary. You still need to sync the metric between the different devices at some point, which happens automatically when metric.compute() is called. Therefore, something like this should still work:

def training_step(self, batch, batch_idx):
    ...
    # accumulate batch statistics in the metric state on each device
    self.metric.update(preds, target)
    ...

def training_epoch_end(self, outputs):
    val = self.metric.compute()  # this will sync the metric between devices
    self.log("metric", val)
    self.metric.reset()  # clear the state before the next epoch
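
For reference, here is a minimal, self-contained sketch of how self.metric could be defined and used inside a LightningModule. It assumes a Lightning 1.x-style module (where the training_epoch_end hook still exists), a multiclass classification task, and torchmetrics.Accuracy; the class name LitClassifier, the placeholder linear model, num_classes, and the "train_acc" log key are illustrative and not taken from the original thread:

import torch
import torchmetrics
from pytorch_lightning import LightningModule

class LitClassifier(LightningModule):  # hypothetical module name
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.model = torch.nn.Linear(32, num_classes)  # placeholder model
        # the metric keeps its own state per device; compute() syncs it
        self.metric = torchmetrics.Accuracy(task="multiclass", num_classes=num_classes)

    def training_step(self, batch, batch_idx):
        x, target = batch
        preds = self.model(x)
        loss = torch.nn.functional.cross_entropy(preds, target)
        self.metric.update(preds, target)  # accumulate per-device state
        return loss

    def training_epoch_end(self, outputs):
        self.log("train_acc", self.metric.compute())  # synced across devices
        self.metric.reset()

Because the metric is assigned as a module attribute, Lightning moves its state to the correct device, and compute() reduces that state across all DDP processes before the value is logged.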

Answer selected by aretor