[sharded plugin] Fix check for fp16 precision #7825

Merged: awaelchli merged 57 commits into Lightning-AI:master from shuyingsunshine21:fix_precision_bug on Jun 4, 2021.
Conversation
Commits in this PR include:
- …oint_consolidate Update test_all_gather_grad.py
- This reverts commit 9d4a2b8.
- This reverts commit 0d23d75.
- This reverts commit 70fe5da.
- This reverts commit a9aae99.
- This reverts commit ea74906.
- This reverts commit bf70e43.
- This reverts commit f172101.
- This reverts commit 536c132.
- This reverts commit 3a9fde9.
- This reverts commit 7a369f4.
- This reverts commit 8222dc9.
- This reverts commit 6c095b2.
- This reverts commit 250d0aa.
- This reverts commit 8651d54.
- This reverts commit dcdcd29.
ananthsub reviewed on Jun 3, 2021:
@@ -54,7 +54,7 @@ def _reinit_optimizers_with_oss(self):
                 optim_class = type(optimizer)
                 zero_optimizer = OSS(params=optimizer.param_groups, optim=optim_class, **optimizer.defaults)
                 if _FAIRSCALE_OSS_FP16_BROADCAST_AVAILABLE:
-                    is_fp16 = self.lightning_module.trainer.precision == 16
+                    is_fp16 = self.lightning_module.trainer.precision == "mixed"
Suggested change:
-                    is_fp16 = self.lightning_module.trainer.precision == "mixed"
+                    precision = self.lightning_module.trainer.precision
+                    is_fp16 = precision == "mixed" or precision == 16
Just in case we enable true fp16 later.
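With the suggestion applied, the check would read roughly as follows. This is a sketch of the surrounding logic, not the exact merged file; the broadcast_fp16 assignment and the num_nodes guard are assumptions about how the flag is consumed:

    if _FAIRSCALE_OSS_FP16_BROADCAST_AVAILABLE:
        precision = self.lightning_module.trainer.precision
        # "mixed" is what MixedPrecisionPlugin reports for precision=16;
        # keeping the == 16 comparison also covers a future true-fp16 mode.
        is_fp16 = precision == "mixed" or precision == 16
        # Assumed usage: compress optimizer state broadcasts to fp16 when
        # running half precision across more than one node.
        zero_optimizer.broadcast_fp16 = is_fp16 and self.num_nodes > 1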
ananthsub approved these changes on Jun 4, 2021.
justusschock approved these changes on Jun 4, 2021.
Auto-merge was automatically disabled on June 4, 2021 at 05:52: the head branch was pushed to by a user without write access.
Would like to add a unit test asserting the wrapped optimizer's attribute.
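Such a test might look like the sketch below. It drives _reinit_optimizers_with_oss directly against a single-process gloo group and a mocked trainer rather than a real multi-GPU run; the broadcast_fp16 attribute and the num_nodes handling are assumptions about the plugin's internals, and the final assertion requires a fairscale version where fp16 broadcast is available:

    import os
    from unittest import mock

    import torch
    import torch.distributed as dist
    from fairscale.optim import OSS

    from pytorch_lightning.plugins import DDPShardedPlugin


    def test_sharded_plugin_detects_mixed_precision():
        # fairscale's OSS needs an initialized process group, even single-process.
        if not dist.is_initialized():
            os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
            os.environ.setdefault("MASTER_PORT", "29500")
            dist.init_process_group("gloo", rank=0, world_size=1)

        plugin = DDPShardedPlugin()
        plugin.num_nodes = 2  # fp16 broadcast is assumed to matter only multi-node
        optimizer = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.1)

        # precision=16 is converted to "mixed" by MixedPrecisionPlugin,
        # so "mixed" is what the sharded plugin actually sees.
        lightning_module = mock.MagicMock()
        lightning_module.trainer.optimizers = [optimizer]
        lightning_module.trainer.precision = "mixed"

        with mock.patch.object(DDPShardedPlugin, "lightning_module", lightning_module):
            plugin._reinit_optimizers_with_oss()

        wrapped = lightning_module.trainer.optimizers[0]
        assert isinstance(wrapped, OSS)
        # Assumption: the plugin records its fp16 decision on this attribute.
        assert wrapped.broadcast_fp16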
awaelchli approved these changes on Jun 4, 2021.
SeanNaren pushed a commit that referenced this pull request on Jun 8, 2021:
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: ananthsub <[email protected]>
(cherry picked from commit ca89a7f)
lexierule pushed a commit that referenced this pull request on Jun 9, 2021:
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: ananthsub <[email protected]>
(cherry picked from commit ca89a7f)
What does this PR do?
Fixes a bug: the precision passed to the Trainer is 16, but the precision plugin converts it to another representation. For example, MixedPrecisionPlugin.precision is "mixed" (https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/plugins/precision/mixed.py), so the sharded plugin's check against 16 never matched.
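A minimal illustration of the mismatch, assuming a GPU machine and the Trainer behavior of Lightning around the 1.3 series (the exact precision plugin chosen for precision=16 depends on the environment):

    from pytorch_lightning import Trainer

    # The user asks for fp16...
    trainer = Trainer(gpus=1, precision=16)

    # ...but the trainer reports the precision plugin's converted value,
    # so the old check against the integer 16 silently failed.
    print(trainer.precision)        # "mixed"
    print(trainer.precision == 16)  # False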
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines.
Did you have fun?
Make sure you had fun coding 🙃