[RFC] Deprecate amp_level in favour of ApexMixedPrecisionPlugin(amp_level=...) #12655
Proposed refactor
Deprecate `amp_level` and let users choose a different `amp_level` by passing `Trainer(..., plugins=[ApexMixedPrecisionPlugin(amp_level=...)])`, as sketched below.

Previous discussion: #9956 (comment)
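For illustration, a minimal sketch of the proposed spelling (this assumes the plugin alone is enough to select the apex backend; the exact interplay with `precision`/`amp_backend` is not specified in this proposal):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import ApexMixedPrecisionPlugin

# Choose the apex optimization level through the plugin itself rather
# than through a top-level Trainer argument.
trainer = Trainer(plugins=[ApexMixedPrecisionPlugin(amp_level="O2")])
```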
Motivation
Currently, `amp_level` is used only when `precision=16` and `amp_backend='apex'`, i.e. it is relevant for only one type of configuration and is generally not required. Users can easily choose it by setting `Trainer(..., plugins=[ApexMixedPrecisionPlugin(amp_level=...)])`, which IMO makes more sense since it reflects that `amp_level` is only relevant for `ApexMixedPrecisionPlugin`.
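For contrast, the current, to-be-deprecated spelling looks like this (a sketch of the one configuration in which `amp_level` takes effect):

```python
from pytorch_lightning import Trainer

# amp_level only matters together with precision=16 and
# amp_backend="apex"; in any other configuration it is irrelevant.
trainer = Trainer(precision=16, amp_backend="apex", amp_level="O2")
```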
Pitch
Deprecate in v1.6 and remove in v1.8.
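A minimal sketch of what the deprecation path could look like, assuming a hypothetical `_check_deprecated_amp_level` helper (the actual location and wording in the codebase may differ); `rank_zero_deprecation` is Lightning's existing deprecation utility:

```python
from typing import Optional

from pytorch_lightning.utilities import rank_zero_deprecation


def _check_deprecated_amp_level(amp_level: Optional[str]) -> None:
    # Hypothetical helper: warn (on rank zero only) when the deprecated
    # Trainer argument is used, while still honouring the value until v1.8.
    if amp_level is not None:
        rank_zero_deprecation(
            "`Trainer(amp_level=...)` is deprecated in v1.6 and will be removed in v1.8."
            " Please pass `Trainer(plugins=[ApexMixedPrecisionPlugin(amp_level=...)])` instead."
        )
```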
Additional context
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @rohitgr7 @tchaton @kaushikb11 @Borda