Measure a metric over the entire test dataset #464
celsofranssa asked this question in Classification
Hello, I normally apply TorchMetrics combined with PL as follows:

```python
import pytorch_lightning as pl
import torch
from hydra.utils import instantiate  # assuming Hydra, given instantiate(hparams.encoder)
from torchmetrics import F1, MetricCollection


class PLModel(pl.LightningModule):
    def __init__(self, hparams):
        super(PLModel, self).__init__()
        self.save_hyperparameters(hparams)
        self.encoder = instantiate(hparams.encoder)
        self.cls_head = torch.nn.Sequential(
            torch.nn.Dropout(hparams.dropout),
            torch.nn.Linear(hparams.hidden_size, hparams.num_classes),
            torch.nn.LogSoftmax(dim=-1)
        )
        # metrics
        self.test_metrics = self._get_metrics(prefix="test_")
        ...

    def _get_metrics(self, prefix):
        return MetricCollection(
            metrics={
                "Mic-F1": F1(num_classes=self.hparams.num_classes, average="micro"),
                "Wei-F1": F1(num_classes=self.hparams.num_classes, average="weighted")
            },
            prefix=prefix)

    ...

    def test_step(self, batch, batch_idx):
        text, true_cls = batch["text"], batch["cls"]
        pred_cls = self.cls_head(
            self(text)
        )
        # log test metrics
        self.log_dict(self.test_metrics(pred_cls, true_cls), prog_bar=True)
```

In this scenario, what is the proper approach to calculate the metrics over the entire test dataset?
Answered by celsofranssa on Aug 18, 2021
Is it automatically done when `test_epoch_end` is implemented in PL?

```python
def test_epoch_end(self, outs):
    self.test_metrics.compute()
```
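A practical note on that hook (a sketch, not part of the original reply): `compute()` aggregates the state accumulated across all `test_step` calls, but by itself it neither logs the result nor clears the metric state. Assuming the `test_metrics` collection from the question, a fuller version might look like:

```python
def test_epoch_end(self, outs):
    # compute() returns one aggregated value per metric in the collection,
    # covering every batch seen during test_step
    epoch_metrics = self.test_metrics.compute()
    self.log_dict(epoch_metrics, prog_bar=True)
    # reset() clears the accumulated state so a later run starts fresh
    self.test_metrics.reset()
```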
Answer selected by SkafteNicki
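For reference, Lightning can also do this aggregation without any epoch-end hook: when a metric object (rather than its computed value) is passed to `self.log`, Lightning calls `compute()` and `reset()` at epoch end itself. A minimal sketch of that variant, using a hypothetical single-metric attribute `test_micro_f1` instead of the question's collection:

```python
def test_step(self, batch, batch_idx):
    text, true_cls = batch["text"], batch["cls"]
    pred_cls = self.cls_head(self(text))
    # update the metric state for this batch
    self.test_micro_f1(pred_cls, true_cls)
    # logging the metric object itself (not a tensor) lets Lightning
    # aggregate over the whole test set and reset the state afterwards
    self.log("test_Mic-F1", self.test_micro_f1, on_step=False, on_epoch=True)
```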