
Update learning rate on each backward pass instead of each forward pass. #1477

Merged
merged 8 commits into from
Apr 20, 2020

Conversation

rmrao
Contributor

@rmrao rmrao commented Apr 13, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Fixes #1476.

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from a team April 13, 2020 18:41
@codecov

codecov bot commented Apr 13, 2020

Codecov Report

Merging #1477 into master will increase coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #1477   +/-   ##
======================================
  Coverage      90%     90%           
======================================
  Files          68      68           
  Lines        3804    3805    +1     
======================================
+ Hits         3441    3442    +1     
  Misses        363     363           

Member

@justusschock justusschock left a comment


LGTM

@justusschock justusschock requested a review from a team April 14, 2020 08:03
@Borda Borda added bug Something isn't working ready PRs ready to be merged labels Apr 14, 2020
@Borda Borda added this to the 0.7.4 milestone Apr 14, 2020
@Borda Borda requested a review from a team April 14, 2020 08:34
Contributor

@williamFalcon williamFalcon left a comment


I don't understand...

Currently, update_learning_rates is called AFTER .backward, is it not?

The call to:
run_training_batch
does a forward AND backward pass.

And the order is as follows:

run_training_batch
update_learning_rates

@mergify mergify bot requested a review from a team April 14, 2020 14:23
@Borda Borda removed the ready PRs ready to be merged label Apr 14, 2020
@rmrao
Contributor Author

rmrao commented Apr 14, 2020

I believe at the moment that run_training_batch runs a forward pass, but only runs a backward pass if (self.batch_idx + 1) % self.accumulate_grad_batches == 0 (see here). This is consistent with how the global step is incremented in run_training_epoch (here).

I didn't move the learning rate update line - just added a condition over it. It might be better / cleaner to move it somewhere else, or to try and move it inside of run_training_batch. Doing that would require rewriting how self.update_learning_rates works, however.
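The condition described above can be sketched as a small helper. This is an illustrative reconstruction, not the actual Lightning source; the names batch_idx and accumulate_grad_batches mirror the attributes mentioned in the discussion.

```python
def should_step_scheduler(batch_idx: int, accumulate_grad_batches: int) -> bool:
    """Return True only on batches where the optimizer actually steps,
    i.e. after gradients from `accumulate_grad_batches` batches have
    been accumulated. The scheduler should step at the same rate."""
    return (batch_idx + 1) % accumulate_grad_batches == 0

# With accumulate_grad_batches=4, only every 4th batch (indices 3, 7, ...)
# triggers an optimizer step, and hence a scheduler step.
stepped = [i for i in range(8) if should_step_scheduler(i, 4)]
```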

@williamFalcon
Contributor

ok got it. that makes sense. it does seem a little dirty at the moment.
Mind adding a todo comment on top of your condition so we know to look into this later?

@mergify
Contributor

mergify bot commented Apr 15, 2020

This pull request is now in conflict... :(

@pep8speaks

pep8speaks commented Apr 15, 2020

Hello @rmrao! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-04-20 11:04:10 UTC

@rmrao
Contributor Author

rmrao commented Apr 15, 2020

Done - added a TODO to potentially merge optimizer step, lr update, and global step increment.
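The consolidation the TODO suggests, keeping the optimizer step, LR update, and global-step increment together so they can never drift apart under gradient accumulation, could look roughly like this. All names here are hypothetical and do not reflect Lightning internals.

```python
class LoopSketch:
    """Toy training-loop bookkeeping: optimizer step, scheduler step,
    and global_step increment happen in one place, gated by the same
    gradient-accumulation condition."""

    def __init__(self, accumulate_grad_batches: int):
        self.accumulate_grad_batches = accumulate_grad_batches
        self.global_step = 0
        self.scheduler_steps = 0

    def on_batch_end(self, batch_idx: int) -> None:
        if (batch_idx + 1) % self.accumulate_grad_batches == 0:
            # optimizer.step() and optimizer.zero_grad() would happen here
            self.scheduler_steps += 1  # scheduler.step()
            self.global_step += 1

loop = LoopSketch(accumulate_grad_batches=2)
for i in range(6):
    loop.on_batch_end(i)
# After 6 batches with accumulation over 2, there were 3 optimizer steps,
# and the scheduler stepped exactly as often as global_step advanced.
```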

@mergify
Contributor

mergify bot commented Apr 16, 2020

This pull request is now in conflict... :(

@Borda Borda added the ready PRs ready to be merged label Apr 16, 2020
@Borda Borda requested review from williamFalcon and Borda April 16, 2020 21:58
@mergify mergify bot requested a review from a team April 16, 2020 21:59
@williamFalcon williamFalcon merged commit 0203938 into Lightning-AI:master Apr 20, 2020
@Borda Borda modified the milestones: 0.7.4, v0.7.x Apr 18, 2021
Labels
bug Something isn't working ready PRs ready to be merged
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Learning rate scheduler should step after each optimizer step
5 participants