

Fix auto scaling mode when calling tune method on trainer. #7321

Merged
merged 5 commits into Lightning-AI:master on May 4, 2021
Conversation

@ramonemiliani93 (Contributor) commented May 3, 2021

What does this PR do?

Fixes #7319

This adds a test that fails while an incorrect mode is being used. The test also demonstrates that passing binsearch currently has no effect on the mode that the auto scaling runs in. I am not sure whether the best fix would be to update the scale_batch_size_kwargs in the tune method of the Trainer.
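To illustrate the reported behavior, here is a minimal, self-contained sketch. The `Trainer` and `scale_batch_size` names below are hypothetical stand-ins for the Lightning internals, not the actual library code:

```python
# Minimal sketch of the reported bug, using hypothetical stand-ins for the
# Lightning internals; not the actual library code.

def scale_batch_size(mode: str = "power", **kwargs) -> str:
    # Stand-in for the tuner's batch-size finder: report which mode it ran in.
    return mode

class Trainer:
    def __init__(self, auto_scale_batch_size=True):
        self.auto_scale_batch_size = auto_scale_batch_size

    def tune(self, **scale_batch_size_kwargs) -> str:
        # Bug: the string stored in auto_scale_batch_size is never forwarded,
        # so the finder always falls back to its "power" default.
        return scale_batch_size(**scale_batch_size_kwargs)

trainer = Trainer(auto_scale_batch_size="binsearch")
print(trainer.tune())  # prints "power" even though "binsearch" was requested
```

A test asserting that `tune()` runs in "binsearch" mode here would fail, which matches the breaking test added in this PR.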

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

codecov bot commented May 3, 2021

Codecov Report

Merging #7321 (76542c0) into master (e0c64f0) will decrease coverage by 7%.
The diff coverage is 100%.

@@           Coverage Diff            @@
##           master   #7321     +/-   ##
========================================
- Coverage      87%     81%     -7%     
========================================
  Files         200     200             
  Lines       12865   14380   +1515     
========================================
+ Hits        11210   11588    +378     
- Misses       1655    2792   +1137     

@awaelchli (Contributor) commented May 3, 2021

Found the issue. In the Tuner class line 46 we need to insert something like this:

    if isinstance(self.trainer.auto_scale_batch_size, str):
        scale_batch_size_kwargs.setdefault("mode", self.trainer.auto_scale_batch_size)

Thanks for adding the breaking test, that's perfect!!
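The suggested snippet relies on `dict.setdefault`, which only inserts the key when the caller has not already set it, so a mode passed explicitly to `tune()` still wins over the trainer attribute. A standalone sketch of that fallback behavior (the `resolve_mode` helper is hypothetical, not the actual Tuner code):

```python
# Standalone sketch of the proposed fix: treat the trainer's string attribute
# as a fallback via dict.setdefault. Hypothetical helper, not the Tuner code.

def resolve_mode(auto_scale_batch_size, scale_batch_size_kwargs):
    if isinstance(auto_scale_batch_size, str):
        # setdefault only inserts "mode" when the caller did not set it
        scale_batch_size_kwargs.setdefault("mode", auto_scale_batch_size)
    return scale_batch_size_kwargs.get("mode", "power")

print(resolve_mode("binsearch", {}))                 # binsearch
print(resolve_mode("binsearch", {"mode": "power"}))  # explicit kwarg wins: power
print(resolve_mode(True, {}))                        # boolean flag: default power
```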

@awaelchli awaelchli added bug Something isn't working tuner labels May 3, 2021
@awaelchli awaelchli added this to the v1.3 milestone May 3, 2021
@awaelchli awaelchli self-assigned this May 3, 2021
@edenlightning edenlightning removed this from the v1.3 milestone May 4, 2021
Co-authored-by: Adrian Wälchli <[email protected]>
@carmocca (Contributor) commented May 4, 2021

we need to insert something like this:

Exactly, sorry I broke this recently!

Pushed the fix

@carmocca carmocca added this to the v1.3 milestone May 4, 2021
@carmocca carmocca added the ready PRs ready to be merged label May 4, 2021
@SkafteNicki (Member) left a comment

LGTM

@SkafteNicki SkafteNicki enabled auto-merge (squash) May 4, 2021 11:54
@SkafteNicki SkafteNicki merged commit 5db832f into Lightning-AI:master May 4, 2021
Labels: bug (Something isn't working), ready (PRs ready to be merged), tuner
Projects: none yet
Development

Successfully merging this pull request may close these issues.

Auto scaling of batch size always runs on power mode.
6 participants