Accelerator Refactor: Precision Plugins #5718
Conversation
Hello @justusschock! Thanks for updating this PR.
Comment last updated at 2021-01-31 17:46:16 UTC
- CPU
- GPU
@SeanNaren if we want to do ZeRO-v3 and offload more to CPU memory, is that handled entirely within the training type plugin? Is there any dependency on the accelerator in which the plugin is contained?
Yeah, that's handled explicitly within the training type plugin, or more precisely within the sharded DDP/OSS class itself. From what I recall, an offload_device argument will be exposed, giving the user (in this case, our accelerator) control.
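To make the division of responsibility concrete, here is a hypothetical sketch of an offload_device knob surfaced by a training-type plugin. The class and method names are illustrative only, not the actual fairscale or Lightning API; the point is that the accelerator (or user) chooses the offload target while the plugin merely forwards it to the sharded optimizer.

```python
# Hypothetical sketch, not the real fairscale/Lightning API.
class ShardedTrainingTypePlugin:
    """Stand-in for a training-type plugin wrapping sharded DDP/OSS."""

    def __init__(self, offload_device: str = "cpu"):
        # Where sharded optimizer state lives; the accelerator (or the user)
        # sets this, and the plugin just passes it through.
        self.offload_device = offload_device

    def configure_sharded_optimizer(self, params):
        # A real implementation would construct a fairscale OSS optimizer
        # here; this stub only records where its state would be offloaded.
        return {"params": list(params), "state_device": self.offload_device}


plugin = ShardedTrainingTypePlugin()          # default: offload to CPU memory
opt = plugin.configure_sharded_optimizer([0])
print(opt["state_device"])  # cpu
```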
pytorch_lightning/plugins/training_type/training_type_plugin.py (outdated; comment resolved)
from pytorch_lightning.plugins.precision.precision_plugin import PrecisionPlugin


class TPUHalfPrecisionPlugin(PrecisionPlugin):
ooo I didn't see this, this is nice :)
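For context, here is a simplified, self-contained sketch of what a TPU half-precision plugin of this shape can do. The base class here is a stand-in, not Lightning's real PrecisionPlugin; the one real detail is that torch_xla switches to bfloat16 via the XLA_USE_BF16 environment variable rather than by casting tensors directly.

```python
import os


class PrecisionPlugin:
    """Simplified stand-in for the precision-plugin base class."""
    precision = 32

    def connect(self, model, optimizers, lr_schedulers):
        return model, optimizers, lr_schedulers


class TPUHalfPrecisionPlugin(PrecisionPlugin):
    """On TPU, half precision is requested through torch_xla's
    XLA_USE_BF16 environment variable, not by casting tensors."""
    precision = 16

    def connect(self, model, optimizers, lr_schedulers):
        os.environ["XLA_USE_BF16"] = "1"  # tell torch_xla to use bfloat16
        return super().connect(model, optimizers, lr_schedulers)


plugin = TPUHalfPrecisionPlugin()
plugin.connect(None, [], [])
print(os.environ["XLA_USE_BF16"])  # 1
```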
Co-Authored with @awaelchli
Co-authored-by: Adrian Wälchli <[email protected]>
Force-pushed from b202ddf to 94e0b28 (compare)
Let's fix the import...
Co-authored-by: Jirka Borovec <[email protected]>
…ytorch-lightning into ref/precision_plugins
Codecov Report
@@           Coverage Diff            @@
##        release/1.2-dev   #5718    +/-  ##
================================================
- Coverage            89%     89%    -0%
================================================
  Files               173     173
  Lines             12495   12339   -156
================================================
- Hits              11175   11017   -158
- Misses             1320    1322     +2
What does this PR do?
Adds precision plugins from #5616
To be merged after #5715
Co-Authored with @awaelchli
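The refactor's core idea can be sketched as follows: the accelerator owns a precision plugin and routes precision-sensitive steps (like scaling the loss before backward) through it. This is an illustrative stand-in, not the actual Lightning API; a real mixed-precision plugin would scale via torch.cuda.amp's GradScaler, which we mimic here with a simple tag.

```python
# Illustrative sketch of accelerator -> precision-plugin delegation.
class NativeMixedPrecisionPlugin:
    precision = 16

    def pre_backward(self, loss):
        # A real implementation would scale the loss with a GradScaler;
        # here we tag the value to show the hook fires.
        return ("scaled", loss)


class Accelerator:
    def __init__(self, precision_plugin):
        # The accelerator holds the plugin and delegates to it, so CPU/GPU/TPU
        # accelerators can share one backward path.
        self.precision_plugin = precision_plugin

    def backward(self, loss):
        return self.precision_plugin.pre_backward(loss)


acc = Accelerator(NativeMixedPrecisionPlugin())
print(acc.backward(1.0))  # ('scaled', 1.0)
```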