PrecisionPlugin should be the source of truth for the precision, amp_backend, and amp_level #11814
Comments
@ananthsub I see the precision classes already have the `precision` class attribute.
@daniellepintz With #12069 the backend would be deprecated :)
I submitted a PR for the first part of this - would appreciate feedback. Was planning to remove the `PrecisionType` and `AMPType` enums in a separate PR, but let me know if it would be better all in one PR.
@ananthsub I see in #12069 we are deprecating the `amp_backend` argument. Given this, does it still make sense to add an `amp_backend` attribute to the `PrecisionPlugin`?
To recap: I don't think there's anything else to decide here, so feel free to re-open if I missed something. Otherwise let's continue in the linked open issues.
Proposed refactor
Motivation
For precision settings to truly be pluggable, the precision plugin instance should be the source of truth for the `precision`, `amp_backend`, and `amp_level` properties within the Trainer. This has many benefits:
Avoids data falling out of sync. Right now, this is a bug because it doesn't consider `precision == "mixed"` or `"bf16"`:
https://github.com/PyTorchLightning/pytorch-lightning/blob/32e7d32956e1685d36f2ab0ca3770baa2f76ce10/pytorch_lightning/trainer/trainer.py#L2111-L2114
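For illustration, a minimal sketch of the delegation proposed here. The stub plugin, its value, and the property body are assumptions for the sketch, not the current Lightning implementation:

```python
from typing import Union


class _StubPrecisionPlugin:
    # Stand-in for a real precision plugin; "bf16" is just an example value.
    precision: Union[str, int] = "bf16"


class Trainer:
    def __init__(self) -> None:
        self.precision_plugin = _StubPrecisionPlugin()

    @property
    def precision(self) -> Union[str, int]:
        # Delegate to the plugin: it owns the value, so "mixed" and "bf16"
        # are reported correctly without the Trainer keeping a second copy
        # that can fall out of sync.
        return self.precision_plugin.precision


assert Trainer().precision == "bf16"
```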
Configuration checks can be done in the individual precision plugin implementations. This removes the need for the framework to stipulate enums for this (enums don't make sense if people can pass in custom classes):
https://github.com/PyTorchLightning/pytorch-lightning/blob/32e7d32956e1685d36f2ab0ca3770baa2f76ce10/pytorch_lightning/utilities/enums.py#L71-L104
If I want to try out 4-bit precision, I should be able to do so without needing to get it added to this enum.
This also simplifies the monolithic `accelerator_connector` file, which does all of this validation today. Each individual plugin can then validate the configuration it is given, removing the need for the framework to specify enums.
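As a sketch of what plugin-level validation could look like, assuming a hypothetical `Int4PrecisionPlugin` with a made-up `group_size` parameter (neither exists in Lightning):

```python
class PrecisionPlugin:
    # Minimal stand-in for the Lightning base class.
    precision: str = "32"


class Int4PrecisionPlugin(PrecisionPlugin):
    """Hypothetical 4-bit precision plugin."""

    precision = "int4"  # no PrecisionType enum entry required

    def __init__(self, group_size: int = 128) -> None:
        # Configuration checks live in the plugin itself rather than in
        # the monolithic accelerator_connector.
        if group_size <= 0:
            raise ValueError(f"`group_size` must be positive, got {group_size}")
        self.group_size = group_size


plugin = Int4PrecisionPlugin(group_size=64)
assert plugin.precision == "int4"
```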
Pitch
Add `precision`, `amp_backend`, and `amp_level` as attributes to the `PrecisionPlugin` base.
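A minimal sketch of what this could look like on the base class. The types, defaults, and the `NativeMixedPrecisionPlugin` values shown are assumptions about how the attributes might be declared, and `amp_backend` may be short-lived given the deprecation discussed in #12069:

```python
from typing import Optional, Union


class PrecisionPlugin:
    """Single source of truth for precision settings within the Trainer."""

    precision: Union[str, int] = 32
    amp_backend: Optional[str] = None  # e.g. "native" or "apex"; None if AMP is unused
    amp_level: Optional[str] = None    # only meaningful for the apex backend


class NativeMixedPrecisionPlugin(PrecisionPlugin):
    precision = 16
    amp_backend = "native"
```

The Trainer would then read these values from its `precision_plugin` instead of storing separate copies.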
cc @justusschock @awaelchli @akihironitta @rohitgr7 @carmocca