Support DDPPlugin to be used on CPU #6208

Merged: 59 commits merged from bugfix/ddp_cpu into master on Jul 2, 2021
Conversation

@awaelchli (Contributor) commented on Feb 25, 2021

What does this PR do?

Before 1.2, the setting accelerator="ddp_cpu" would use the DDPSpawnAccelerator; today it uses the DDPSpawnPlugin. After the refactor, however, we can also support DDPPlugin (non-spawn) on CPU.

This can be interpreted as either a bug fix or an added feature.

# this always worked, still works, would use spawn method
Trainer(accelerator="ddp_cpu", num_processes=2)

# this now works too, uses spawn method
Trainer(accelerator="ddp_cpu", num_processes=2, plugins=[DDPSpawnPlugin(find_unused_parameters=True)])

# this now works, uses subprocess ddp
Trainer(accelerator="ddp_cpu", num_processes=2, plugins=[DDPPlugin(find_unused_parameters=True)])

Fixes #6121
Related #7810

Introduces a new conftest fixture, applied to all tests, to address the following problem:
All DDP tests that initialize a process group have global side effects, since the process group is global state in PyTorch. The process group can therefore carry over from one test to the next, which causes problems when the world size changes or the distributed backend switches, e.g., from gloo to nccl, leading to broken pipes and hangs. We address this by forcefully destroying the process group after every test. A different teardown approach will be explored in #8080.
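
A minimal sketch of such a fixture (the actual implementation is in the PR diff; this sketch assumes only the public torch.distributed API):

# conftest.py: after every test, destroy any process group the test left behind
import pytest
import torch.distributed

@pytest.fixture(autouse=True)
def teardown_process_group():
    yield  # run the test first
    if torch.distributed.is_available() and torch.distributed.is_initialized():
        torch.distributed.destroy_process_group()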

Additionally, some test refactoring changes are necessary.
"Special tests" execute one new process per test, but this currently includes all parametrizations of a test in the same process. This is not good: we want each parametrization to be its own independent run. Otherwise we just end up with subprocesses trying to launch new processes.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

No, not on this one.

@awaelchli added the bug (Something isn't working) and distributed (Generic distributed-related topic) labels on Feb 25, 2021
@awaelchli added this to the 1.2.x milestone on Feb 25, 2021
codecov bot commented on Feb 25, 2021

Codecov Report

Merging #6208 (5b40e21) into master (baa7de2) will decrease coverage by 5%.
The diff coverage is 100%.

@@           Coverage Diff           @@
##           master   #6208    +/-   ##
=======================================
- Coverage      93%     88%    -5%     
=======================================
  Files         212     212            
  Lines       13701   13697     -4     
=======================================
- Hits        12741   12054   -687     
- Misses        960    1643   +683     

@awaelchli added the priority: 0 (High priority task) label on Feb 27, 2021
@awaelchli marked this pull request as ready for review on February 27, 2021 01:05
@justusschock (Member) left a comment:

I like it. Can we maybe raise a warning for

# this always worked, still works, would use spawn method
Trainer(accelerator="ddp_cpu", num_processes=2)

saying that this behavior might change in the future in favor of ddp, and that people should use

Trainer(accelerator="ddp_cpu_spawn", num_processes=2)

instead?
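
A rough sketch of what such a warning could look like (the helper name, condition, and message here are hypothetical, using only the standard warnings module rather than any Lightning utility):

import warnings

def warn_ddp_cpu_default(accelerator, plugins):  # hypothetical helper
    # If the user picked "ddp_cpu" without an explicit plugin, warn that the
    # spawn default may change in favor of subprocess DDP.
    if accelerator == "ddp_cpu" and not plugins:
        warnings.warn(
            "accelerator='ddp_cpu' currently defaults to the spawn method; "
            "this default may change in favor of subprocess DDP. Use "
            "accelerator='ddp_cpu_spawn' to keep the current behavior.",
            DeprecationWarning,
        )

warn_ddp_cpu_default("ddp_cpu", [])  # emits the DeprecationWarning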

Also in

# this now works too, uses spawn method
Trainer(accelerator="ddp_cpu", num_processes=2, plugins=[DDPSpawnPlugin(find_unused_parameters=True)])

# this now works, uses subprocess ddp
Trainer(accelerator="ddp_cpu", num_processes=2, plugins=[DDPPlugin(find_unused_parameters=True)])

specifying the accelerator is not necessary when specifying the plugin, and vice versa, right? So if I do Trainer(num_processes=2, plugins=[DDPPlugin()]), it should work as long as I don't specify any GPUs?

@mergify bot removed the has conflicts label on Feb 27, 2021
tests/special_tests.sh: review thread (outdated, resolved)
@awaelchli changed the title from "Support DDPPlugin to be used with CPU" to "Support DDPPlugin to be used with CPU [WIP]" on Feb 28, 2021
@Borda (Member) commented on Mar 20, 2021

@awaelchli still in progress?

@mergify bot removed the has conflicts label on Mar 20, 2021
@awaelchli (Contributor, Author):

I have some problems with RPC tests failing; once I am able to fix the issue, it will be ready for review.

@edenlightning modified the milestones: v1.4, v1.3.x on Jul 1, 2021
@Borda requested review from kaushikb11 and tchaton on July 1, 2021 21:06
@mergify bot added the has conflicts label on Jul 1, 2021
@mergify bot removed the has conflicts label on Jul 1, 2021
@carmocca removed the ready (PRs ready to be merged) label on Jul 1, 2021
@carmocca (Contributor) commented on Jul 1, 2021

Removing the "Ready to go" tag as there are still some hangs :(

@tchaton (Contributor) left a comment:

LGTM!

@tchaton merged commit e7139ab into master on Jul 2, 2021
@tchaton deleted the bugfix/ddp_cpu branch on July 2, 2021 11:00
@carmocca mentioned this pull request on Aug 2, 2021
Labels
bug (Something isn't working), distributed (Generic distributed-related topic), priority: 0 (High priority task)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

DDPCPU not working with DDPPlugin(find_unused_parameters=True) in Ver 1.2.0
7 participants