
remove .item which causes sync issues #1254

Merged
merged 3 commits into master from sync_points
Mar 30, 2020

Conversation

williamFalcon
Contributor

@williamFalcon commented Mar 26, 2020

Removes unnecessary syncs in DDP caused by unnecessary `.item()` calls.
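For context, calling `.item()` on a CUDA tensor copies the scalar back to the host, which blocks until the GPU stream has finished its queued work; under DDP every process pays that stall on every call. A minimal sketch of the pattern this PR moves toward (function names here are illustrative, not Lightning's API):

```python
import torch

def accumulate_slow(loss: torch.Tensor, total: float) -> float:
    # .item() copies the scalar to the CPU and blocks until the GPU
    # stream catches up -- under DDP, every rank stalls here each step.
    return total + loss.item()

def accumulate_fast(loss: torch.Tensor, total: torch.Tensor) -> torch.Tensor:
    # Keeping the accumulator as a tensor defers the host copy;
    # detach() drops the autograd graph so memory does not grow.
    return total + loss.detach()
```

The host copy can then happen once per epoch (e.g. for logging) instead of once per batch.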

@Borda Borda added this to the 0.7.2 milestone Mar 27, 2020
@Borda Borda added the bug Something isn't working label Mar 27, 2020
@@ -0,0 +1,34 @@
import torch
Member


what about creating a metrics package?

Contributor Author


not relevant to this

Member

@ethanwharris left a comment


LGTM :)

@codecov

codecov bot commented Mar 29, 2020

Codecov Report

Merging #1254 into master will increase coverage by 0%.
The diff coverage is 97%.

@@          Coverage Diff           @@
##           master   #1254   +/-   ##
======================================
  Coverage      91%     91%           
======================================
  Files          61      62    +1     
  Lines        3153    3176   +23     
======================================
+ Hits         2879    2901   +22     
- Misses        274     275    +1     

@williamFalcon williamFalcon merged commit 31b7148 into master Mar 30, 2020
alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Mar 30, 2020
* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched
@Borda Borda deleted the sync_points branch March 30, 2020 06:26
alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 1, 2020
* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched
alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 3, 2020
* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched
Borda pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 7, 2020
* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched
williamFalcon added a commit that referenced this pull request Apr 8, 2020
* `add_argparse_args` method fixed (argument types added)

* autopep8 fixes

* --gpus=0 removed from test (for ci tests)

* Update pytorch_lightning/trainer/trainer.py

Co-Authored-By: Joe Davison <[email protected]>

* test_with_accumulate_grad_batches added

* agg_and_log_metrics logic added to the base logger class

* small format fix

* agg metrics strategies removed (not to complicate stuff)

* agg metrics: handle zero step

* autopep8

* changelog upd

* flake fix

* metrics aggregators factored out, metrics_agg.py added + tests

* metrics agg default value added

* Update pytorch_lightning/loggers/metrics_agg.py

Co-Authored-By: Jirka Borovec <[email protected]>

* metrics aggregators factored out, metrics_agg.py added + tests

* metrics agg default value added

* Update pytorch_lightning/loggers/metrics_agg.py

Co-Authored-By: Jirka Borovec <[email protected]>

* remove .item which causes sync issues (#1254)

* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched

* test_metrics_agg.py removed (all tested in docstrings), agg metrics refactored

* test_metrics_agg.py removed (all tested in docstrings), agg metrics refactored

* autopep8

* loggers base.py types fixed

* test

* test

* metrics aggregation for loggers: each key now has a specific function (or default one)

* metrics aggregation for loggers: each key now has a specific function (or default one)

* docstrings upd

* manual typehints removed from docstrings

* batch_size decreased for test `test_with_accumulate_grad_batches`

* extend running accum

* refactor

* fix tests

* fix tests

* allowed_types generator scoped

* trainer.py distutils was imported twice, fixed

* TensorRunningAccum refactored

* TensorRunningAccum added to change log (Changed)

* change log pull link added

Co-authored-by: Joe Davison <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: William Falcon <[email protected]>
Co-authored-by: J. Borovec <[email protected]>
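The `TensorRunningAccum` work referenced in the commit message above can be illustrated with a ring-buffer accumulator that keeps values on-device as tensors, so no `.item()` sync is needed per step. This is a hypothetical sketch under that assumption, not the actual Lightning class:

```python
import torch

class RunningAccum:
    """Ring-buffer accumulator over scalar tensors (illustrative sketch,
    not Lightning's TensorRunningAccum)."""

    def __init__(self, window: int):
        self.memory = torch.zeros(window)
        self.idx = 0
        self.filled = False

    def append(self, x: torch.Tensor) -> None:
        # Store detached values so no autograd graph is retained.
        self.memory[self.idx] = x.detach()
        self.idx = (self.idx + 1) % len(self.memory)
        if self.idx == 0:
            self.filled = True

    def mean(self) -> torch.Tensor:
        # Average only over the slots written so far; stays a tensor,
        # so reading it does not force a device synchronization.
        valid = self.memory if self.filled else self.memory[: self.idx]
        return valid.mean()
```

Once the buffer wraps, `mean()` reflects only the most recent `window` values, which is what a running-loss display wants.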
alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 10, 2020
* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched
tullie pushed a commit to tullie/pytorch-lightning that referenced this pull request Jun 7, 2020
…AI#1278)

* `add_argparse_args` method fixed (argument types added)

* autopep8 fixes

* --gpus=0 removed from test (for ci tests)

* Update pytorch_lightning/trainer/trainer.py

Co-Authored-By: Joe Davison <[email protected]>

* test_with_accumulate_grad_batches added

* agg_and_log_metrics logic added to the base logger class

* small format fix

* agg metrics strategies removed (not to complicate stuff)

* agg metrics: handle zero step

* autopep8

* changelog upd

* flake fix

* metrics aggregators factored out, metrics_agg.py added + tests

* metrics agg default value added

* Update pytorch_lightning/loggers/metrics_agg.py

Co-Authored-By: Jirka Borovec <[email protected]>

* metrics aggregators factored out, metrics_agg.py added + tests

* metrics agg default value added

* Update pytorch_lightning/loggers/metrics_agg.py

Co-Authored-By: Jirka Borovec <[email protected]>

* remove .item which causes sync issues (Lightning-AI#1254)

* remove .item which causes sync issues

* fixed gradient acc sched

* fixed gradient acc sched

* test_metrics_agg.py removed (all tested in docstrings), agg metrics refactored

* test_metrics_agg.py removed (all tested in docstrings), agg metrics refactored

* autopep8

* loggers base.py types fixed

* test

* test

* metrics aggregation for loggers: each key now has a specific function (or default one)

* metrics aggregation for loggers: each key now has a specific function (or default one)

* docstrings upd

* manual typehints removed from docstrings

* batch_size decreased for test `test_with_accumulate_grad_batches`

* extend running accum

* refactor

* fix tests

* fix tests

* allowed_types generator scoped

* trainer.py distutils was imported twice, fixed

* TensorRunningAccum refactored

* TensorRunningAccum added to change log (Changed)

* change log pull link added

Co-authored-by: Joe Davison <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: William Falcon <[email protected]>
Co-authored-by: J. Borovec <[email protected]>
@Borda Borda modified the milestones: v0.7., v0.7.x Apr 18, 2021
3 participants