
[RFC] Deprecate amp_level in favour of ApexMixedPrecisionPlugin(amp_level=...) #12655

Closed
rohitgr7 opened this issue Apr 7, 2022 · 1 comment · Fixed by #13898
Labels: deprecation, help wanted, refactor, trainer: argument
Comments

rohitgr7 (Contributor) commented Apr 7, 2022

Proposed refactor

Deprecate amp_level and let users choose a different amp_level by passing Trainer(..., plugins=[ApexMixedPrecisionPlugin(amp_level=...)]).
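A minimal sketch of the change from the user's perspective (the import path and plugin signature are taken from the current API; whether amp_backend="apex" still needs to be passed alongside an explicit plugin is an assumption and depends on how the accelerator connector resolves it):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import ApexMixedPrecisionPlugin

# Current API (to be deprecated): the O-level is a top-level Trainer argument,
# even though it only matters for the apex backend (requires NVIDIA apex installed).
trainer = Trainer(precision=16, amp_backend="apex", amp_level="O2")

# Proposed API: the O-level lives on the plugin that actually consumes it.
trainer = Trainer(
    precision=16,
    amp_backend="apex",  # assumption: may become redundant once the plugin is passed explicitly
    plugins=[ApexMixedPrecisionPlugin(amp_level="O2")],
)
```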

Previous discussion: #9956 (comment)

Motivation

Currently, amp_level is used only when precision=16 and amp_backend='apex', i.e. it is relevant for just one type of configuration and is generally not required.

Users can easily choose it by setting Trainer(..., plugins=[ApexMixedPrecisionPlugin(amp_level=...)]), which IMO makes more sense since it reflects that amp_level is only relevant for ApexMixedPrecisionPlugin.

Pitch

Deprecate in v1.6 and remove in v1.8.
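A hypothetical sketch of how the deprecation step could surface during the transition window (the helper name _check_amp_level is made up for illustration; rank_zero_deprecation is the utility Lightning uses for such messages):

```python
from pytorch_lightning.utilities import rank_zero_deprecation


def _check_amp_level(amp_level):
    # Hypothetical check run while parsing Trainer arguments: the argument keeps
    # working until removal, but users are pointed at the plugin-based configuration.
    if amp_level is not None:
        rank_zero_deprecation(
            "Setting `Trainer(amp_level=...)` is deprecated in v1.6 and will be removed in v1.8."
            " Please pass an `ApexMixedPrecisionPlugin` with the desired `amp_level` to"
            " `Trainer(plugins=...)` instead."
        )
```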

Additional context



cc @justusschock @awaelchli @rohitgr7 @tchaton @kaushikb11 @Borda

@rohitgr7 rohitgr7 added the refactor and deprecation labels Apr 7, 2022
@rohitgr7 rohitgr7 added this to the 1.7 milestone Apr 7, 2022
@carmocca carmocca modified the milestones: pl:1.7, pl:future Jul 19, 2022
@carmocca carmocca added the help wanted label Jul 19, 2022
awaelchli (Contributor) commented

I am in favor of this. Note this was also proposed in #11814, but I think the precision argument should stay on the Trainer. Hence I prefer this proposal over the more aggressive one in #11814.

Status: Accepted