
Deprecate add_to_queue / get_from_queue #8940

Closed
daniellepintz opened this issue Aug 16, 2021 · 7 comments · Fixed by #9118
Labels

  • deprecation: Includes a deprecation
  • design: Includes a design discussion
  • feature: Is an improvement or enhancement
  • help wanted: Open to be worked on
  • let's do it!: approved to implement

Comments

@daniellepintz
Contributor

daniellepintz commented Aug 16, 2021

🚀 Feature

Deprecate these two functions from the lightning interface:
https://github.com/PyTorchLightning/pytorch-lightning/blob/4b6aaeeae3c53c45cf31a7fc2cf90133acfe5d12/pytorch_lightning/core/lightning.py#L1989-L2014

Motivation

Both of these hooks are hyper-specific to transferring data back from spawned processes. They are not related to ML code at all and should not be on the LightningModule interface.

More discussion here: https://docs.google.com/document/d/1xHU7-iQSpp9KJTjI3As2EM0mfNHHr37WZYpDpwLkivA/edit?disco=AAAANaZ8IYw

This is part of a larger initiative to separate the Trainer and the LightningModule (#7315), as part of the Lightning Architecture Review (#7740).

Pitch

  • Deprecate these methods in v1.5
  • Remove these methods in v1.7

Alternatives

Additional context


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning

  • Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.

@daniellepintz daniellepintz added feature Is an improvement or enhancement help wanted Open to be worked on labels Aug 16, 2021
@ananthsub ananthsub added the design Includes a design discussion label Aug 16, 2021
@tchaton
Contributor

tchaton commented Aug 16, 2021

Hey @daniellepintz,

Even if I agree that they are quite specific, they were introduced to resolve a pretty important use case related to the Optuna integration.

Therefore, I believe we should propose a better alternative before starting a deprecation.

Best,
T.C

@ananthsub
Contributor

Even if I agree with the fact they are quite specific, they were introduced to resolve a pretty important use-case related Optuna Integration.

Out of curiosity, do you have a pointer to the Optuna integration?

@daniellepintz
Contributor Author

Thanks for your input @tchaton! Could you share more details on the use case so we can see if there is a better alternative here?

@tchaton
Contributor

tchaton commented Aug 18, 2021

Dear @daniellepintz,

Here are the issues related to these additions: #7916 and #7671.

When using ddp_spawn, all components are modified within the spawned processes; when the spawn context finishes, those changes are not reflected in the main process.
We currently transfer back the weights, the last checkpoint path, the results, and the callback metrics: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/plugins/training_type/ddp_spawn.py#L307

However, users might want to bring more things from the spawned processes through the queue and those hooks were introduced for this use-case.

For Optuna, which relies on ddp_spawn, the callback_metrics were required, which is why the default hooks attach them: https://github.com/PyTorchLightning/pytorch-lightning/blob/5329b0d11358b72a42d596d3b70b7010e35c45af/pytorch_lightning/core/lightning.py#L1957

Unfortunately, I don't have a better solution for now as it can be model specific.
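For readers unfamiliar with the mechanism described above, here is a framework-free sketch of the pattern: a spawned worker pushes extra state (standing in for callback_metrics) onto a queue before exiting, and the main process pulls it off after the join. This uses nothing from pytorch_lightning; the function names and the hard-coded metric value are invented for illustration.

```python
import multiprocessing as mp


def _worker(queue):
    # Runs in the spawned process. Real training would happen here;
    # afterwards, extra state is pushed onto the queue so the main
    # process can recover it once the spawn context exits.
    callback_metrics = {"val_loss": 0.25}  # stand-in for real metrics
    queue.put(callback_metrics)


def main():
    ctx = mp.get_context("spawn")
    queue = ctx.SimpleQueue()
    proc = ctx.Process(target=_worker, args=(queue,))
    proc.start()
    proc.join()
    # Back in the main process: drain the queue, mirroring what the
    # get-from-queue step does after the spawned fit() returns.
    return queue.get()


if __name__ == "__main__":
    print(main())
```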

@ananthsub
Contributor

@tchaton - as this was introduced recently in DDP spawn, we can by default transfer the callback metrics inside of the plugin. This wouldn't have to be exposed to the user, and would still meet the Optuna requirements.

if users want to bring more things from the spawned process, shouldn't they be able to extend the DDP spawn plugin to customize this? Is this customization specific to the spawn plugins?

In some ways, this feels very similar to #8742 , as the lightning module is used for customization when we could offer this customization at the corresponding component (here ddp spawn plugin, there progress bar)
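To make the proposed customization point concrete, here is a hypothetical sketch (not the real pytorch_lightning API; all class and method names, and the use of a plain list as the queue, are invented): a base plugin transfers the default payload, and a user subclass extends the transfer without touching the LightningModule.

```python
class SpawnPluginSketch:
    """Toy stand-in for a spawn training-type plugin."""

    def add_to_queue(self, worker_state, queue):
        # Default transfer: callback metrics only (the Optuna requirement).
        queue.append({"callback_metrics": worker_state["callback_metrics"]})

    def get_from_queue(self, main_state, queue):
        # Pop the default payload into the main-process state.
        main_state.update(queue.pop(0))


class MySpawnPlugin(SpawnPluginSketch):
    """User subclass bringing extra state across the spawn boundary."""

    def add_to_queue(self, worker_state, queue):
        super().add_to_queue(worker_state, queue)
        queue.append({"best_threshold": worker_state["best_threshold"]})

    def get_from_queue(self, main_state, queue):
        super().get_from_queue(main_state, queue)
        main_state.update(queue.pop(0))
```

Usage: the spawned process calls `add_to_queue`, the main process calls `get_from_queue`, and the extra key travels alongside the defaults without any hook on the module itself.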

@tchaton
Contributor

tchaton commented Aug 19, 2021

Hey @ananthsub.

I think we could move those hooks directly to the DDPSpawnPlugin instead, and users could easily override them there.

Best,
T.C

@awaelchli awaelchli added the deprecation Includes a deprecation label Aug 19, 2021
@daniellepintz
Contributor Author

Sounds good, I will submit a PR to move these hooks to DDPSpawnPlugin! Thanks everyone.

Projects
None yet
4 participants