
Remove silent behavior when num_slurm_tasks does not correspond to number of processes in Trainer #14300

Merged
merged 21 commits into master from feature/slurm-bash on Sep 16, 2022

Conversation

awaelchli
Contributor

@awaelchli awaelchli commented Aug 18, 2022

What does this PR do?

In our AcceleratorConnector, we currently have logic that silently disables the SLURMEnvironment when the total number of requested devices does not match the number of tasks registered with SLURM (SLURM_NTASKS), i.e., the following logic:

import os

# Total number of devices the user requested via the Trainer flags.
total_requested_devices = len(self._parallel_devices) * self._num_nodes_flag
# Number of tasks SLURM launched for this job (0 if not running under SLURM).
num_slurm_tasks = int(os.environ.get("SLURM_NTASKS", 0))
should_use_slurm = num_slurm_tasks == total_requested_devices

# SLURM is only auto-detected when the two numbers happen to match.
if should_use_slurm:
    env = SLURMEnvironment()

This logic has been there since the beginning of Lightning, but it is extremely error-prone. For example, if the user forgets to set the num_nodes flag on the Trainer to the right value, or forgets to set it at all, this can lead to hanging processes because Lightning expects a different number of processes than SLURM provides.
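
To make that failure mode concrete, here is a hypothetical example of such a mismatch (the SLURM settings and Trainer arguments below are illustrative, not taken from a reported case):

# Suppose the job was submitted with `#SBATCH --ntasks=4`, so SLURM_NTASKS == 4,
# but the user forgot to set num_nodes on the Trainer (it defaults to 1):
from pytorch_lightning import Trainer

trainer = Trainer(accelerator="gpu", devices=2)  # total_requested_devices == 2
# Since 2 != 4, the old logic silently skipped SLURMEnvironment, and the four
# SLURM-launched processes could end up waiting for ranks that never join.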

Why was it there in the first place?
The use case is that the user wants to let the SLURM manager provision the node but does not want SLURM to launch the Python processes; instead, the user wants to launch the processes themselves. In that case, the user can simply pass Trainer(plugins=LightningEnvironment()). This way, even inside a SLURM environment, the Trainer will not auto-detect SLURM and the user can launch processes however they want, as sketched below.
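
A minimal sketch of that opt-out, assuming a recent PyTorch Lightning 1.x import layout (the accelerator/devices values are placeholders):

from pytorch_lightning import Trainer
from pytorch_lightning.plugins.environments import LightningEnvironment

# Explicitly opt out of SLURM auto-detection: Lightning treats this as a
# user-launched job, so the user is responsible for starting one process per
# device themselves (e.g., via torchrun or a custom launch script).
trainer = Trainer(
    accelerator="gpu",
    devices=2,
    num_nodes=1,
    plugins=LightningEnvironment(),
)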

Also related to #14078

Fixes #13605

Does your PR introduce any breaking changes? If yes, please list them.

Yes, but the previous behavior was undocumented and in most cases led to surprising behavior for the user.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

I made sure I had fun coding 🙃

cc @awaelchli

@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Aug 18, 2022
@awaelchli awaelchli added this to the pl:1.8 milestone Aug 18, 2022
@awaelchli awaelchli self-assigned this Aug 18, 2022
@awaelchli awaelchli mentioned this pull request Aug 19, 2022
@awaelchli awaelchli marked this pull request as ready for review August 27, 2022 23:22
@awaelchli awaelchli requested a review from otaj as a code owner August 27, 2022 23:25
@codecov

codecov bot commented Aug 28, 2022

Codecov Report

Merging #14300 (89adc4b) into master (89adc4b) will not change coverage.
The diff coverage is n/a.

❗ Current head 89adc4b differs from pull request most recent head 5326aa3. Consider uploading reports for the commit 5326aa3 to get more accurate results

@@           Coverage Diff           @@
##           master   #14300   +/-   ##
=======================================
  Coverage      62%      62%           
=======================================
  Files         396      396           
  Lines       28879    28879           
=======================================
  Hits        17774    17774           
  Misses      11105    11105           

@mergify mergify bot removed the has conflicts label Sep 15, 2022
@mergify mergify bot added the ready PRs ready to be merged label Sep 15, 2022
@Borda Borda enabled auto-merge (squash) September 15, 2022 17:26
@Borda Borda merged commit 619e76f into master Sep 16, 2022
@Borda Borda deleted the feature/slurm-bash branch September 16, 2022 11:00
Labels
environment: slurm · pl (Generic label for PyTorch Lightning package) · ready (PRs ready to be merged)
Development

Successfully merging this pull request may close these issues.

_is_slurm_managing_tasks(self) incorrectly returns False when using SLURM's --gpu-bind=closest
3 participants