TODOs for achieving stability for Trainer Strategy API #9932
Do we want to also rename `DeviceType` here to `AcceleratorType`?
Added to the list.
kaushikb11 changed the title from "TODOs after addition of the `strategy` argument" to "TODOs for achieving stability for Trainer Strategy API" on Nov 16, 2021.
Repository owner moved this from In Progress to Done in Frameworks Planning on Mar 22, 2022.
Proposed refactoring or deprecation
Motivation
We have added the `strategy` argument for `Trainer` in this PR. The next immediate refactors required are:
Post the above refactors, we will be going through the below internal updates based on the `strategy` value:

- Deprecate `DistributedType` in favor of `StrategyType` #10505
- Deprecate `DeviceType` in favor of `_AcceleratorType` #10503
- Remove `distributed_backend` from `Trainer` #10017
- Remove the `self.distributed_backend` references.
- Introduce `Strategy` in favour of `TrainingTypePlugin` #10548
- Update the tests in `tests/accelerators/test_accelerator_connector.py` that test for the deprecated usage of `accelerator="strategy_name"` (illustrated in the sketch below).
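For context, here is a minimal sketch of the user-facing change these items build on. It assumes PyTorch Lightning 1.5+ (where the `strategy` argument is available) and a machine with two GPUs; `ddp` is just an example strategy name:

```python
from pytorch_lightning import Trainer

# Deprecated usage: the strategy name was passed through `accelerator`,
# conflating hardware selection with the distributed strategy.
trainer = Trainer(accelerator="ddp", gpus=2)

# Preferred usage: `strategy` selects the distributed strategy, while
# `accelerator` and `devices` select the hardware to run on.
trainer = Trainer(strategy="ddp", accelerator="gpu", devices=2)
```

The internal updates listed above are what allow the accelerator connector to resolve everything from the single `strategy` value instead of the overloaded `accelerator`/`distributed_backend` arguments.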
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.