
Useful abstraction for metrics tracker metrics #180

Closed
rajs96 opened this issue Apr 19, 2021 · 7 comments · Fixed by #238
Labels
enhancement (New feature or request), help wanted (Extra attention is needed), New metric

Comments

@rajs96
Contributor

rajs96 commented Apr 19, 2021

🚀 Feature

Integrate timeseries metrics: an abstraction (a class) that holds the metrics recorded across a time series.

Motivation

At work, we've often been interested in seeing how metrics besides the loss change throughout the training process - using a designated object with built-in functionalities would save the time of having to implement these functionalities ourselves. For example, we may want to plot metric values throughout training, return the max metric throughout the training for each metric, average timeseries metrics from multiple models for cross validation, etc.

Pitch

We should create a class that makes it easy to add timesteps of Metric modules (or MetricCollections), and add in useful functionalities for the class. Refer to the following pseudocode:

    ts = TimeSeriesMetrics(MetricCollection(Accuracy(), Precision(), Recall(), F1()))
    for _ in range(epochs):
        model.train_epoch()
        preds = model(data)
        ts.add_timestep(preds, targets)
    
    # get best f1 score and the associated timestep
    best_timestep, best_f1 = ts.best_metric('f1', return_timestep=True)
    
    # get all precision scores in order
    precision_scores = ts.all_metrics('precision')

    # given multiple timeseries metric objects, average along the timeseries axis
    # for example, if you were doing cross validation
    ts_metric_avg = timeseries_avg([TimeSeriesMetrics1, ...])
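A minimal, self-contained sketch of how the proposed class could work. Names (`TimeSeriesMetrics`, `add_timestep`, `best_metric`, `all_metrics`, `timeseries_avg`) are taken from the pseudocode above; plain callables stand in for Metric modules / MetricCollections so the sketch runs standalone, and none of this is an actual torchmetrics interface:

```python
# Hypothetical sketch of the proposed TimeSeriesMetrics API.
# A real implementation would wrap Metric/MetricCollection modules;
# here a callable mapping (preds, targets) -> {name: value} stands in.

class TimeSeriesMetrics:
    def __init__(self, compute_fn):
        self.compute_fn = compute_fn   # stands in for a MetricCollection
        self.history = []              # one {metric_name: value} dict per timestep

    def add_timestep(self, preds, targets):
        self.history.append(self.compute_fn(preds, targets))

    def all_metrics(self, name):
        # all recorded values for one metric, in timestep order
        return [step[name] for step in self.history]

    def best_metric(self, name, return_timestep=False):
        values = self.all_metrics(name)
        best_t = max(range(len(values)), key=values.__getitem__)
        return (best_t, values[best_t]) if return_timestep else values[best_t]


def timeseries_avg(trackers):
    # average several trackers element-wise along the timestep axis,
    # e.g. one tracker per cross-validation fold
    n = len(trackers)
    length = len(trackers[0].history)
    return [
        {name: sum(t.history[i][name] for t in trackers) / n
         for name in trackers[0].history[i]}
        for i in range(length)
    ]
```

Usage would mirror the pseudocode: construct the tracker once, call `add_timestep` after each epoch, then query `best_metric` / `all_metrics` after training.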

Alternatives

We could theoretically just have the user maintain a Python list of Metric modules (or of dicts computed from them) and implement free functions that act on these lists, rather than creating a class.
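The list-based alternative could look like the following sketch (illustrative helper names, not an existing API): the user appends one dict of computed metric values per epoch and calls standalone functions over that list.

```python
# List-based alternative: a plain list of per-epoch metric dicts
# plus free functions, instead of a dedicated tracker class.

def best_metric(history, name):
    """Return (timestep, value) for the best recorded value of one metric."""
    best_t = max(range(len(history)), key=lambda i: history[i][name])
    return best_t, history[best_t][name]

def all_metrics(history, name):
    """Return all recorded values of one metric, in timestep order."""
    return [step[name] for step in history]

history = []
for f1 in (0.60, 0.72, 0.68):   # pretend: one computed dict per training epoch
    history.append({"f1": f1})

print(best_metric(history, "f1"))   # (1, 0.72)
```

The trade-off is that the class variant can keep the underlying Metric modules' state and API in one place, while the list variant requires no new abstraction.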

Additional context

I came across these problems / ideas when training transformer models for 15-20 epochs with cross validation.

@rajs96 rajs96 added enhancement New feature or request help wanted Extra attention is needed labels Apr 19, 2021
@github-actions

Hi! Thanks for your contribution, great first issue!

@Borda
Member

Borda commented Apr 19, 2021

@rajs96, it sounds like a great addition to this package, are you interested in implementing this?
🐰 it would be a highlight for the next release!
cc: @PyTorchLightning/core-metrics

@Borda Borda added this to the v0.4 milestone Apr 19, 2021
@rajs96
Contributor Author

rajs96 commented Apr 19, 2021

@rajs96, it sounds like a great addition to this package, are you interested in implementing this?
🐰 it would be a highlight for the next release!
cc: @PyTorchLightning/core-metrics

Yup - what are next steps then?

@Borda
Member

Borda commented Apr 26, 2021

I would say make an abstract/base class for this metric group, and then we will add particular metrics in separate PRs...
btw, ping us on Slack so we can coordinate more easily :]

@Borda
Member

Borda commented May 3, 2021

@rajs96 how is it going, do you think we can have an initial draft this week?

@rajs96
Contributor Author

rajs96 commented May 4, 2021

@rajs96 how is it going, do you think we can have an initial draft this week?

I can start working on a PR draft this weekend - would that work?

@Borda
Member

Borda commented May 4, 2021

I can start working on a PR draft this weekend - would that work?

Sooner is better, so we have more time to fine-tune it =)

@Borda Borda mentioned this issue May 10, 2021
4 tasks
@Borda Borda changed the title Useful abstraction for timeseries metrics Useful abstraction for metrics tracker metrics Jun 16, 2021
@Borda Borda unpinned this issue Jul 28, 2021
@SkafteNicki SkafteNicki linked a pull request Aug 3, 2021 that will close this issue
4 tasks
@Borda Borda modified the milestones: v0.4, v0.5 Aug 3, 2021
@Borda Borda closed this as completed in #238 Aug 4, 2021