Update learning rate on each backward pass instead of each forward pass. #1477
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master    #1477   +/-   ##
=======================================
  Coverage      90%      90%
=======================================
  Files          68       68
  Lines        3804     3805     +1
=======================================
+ Hits         3441     3442     +1
  Misses        363      363
LGTM
I don't understand... Currently, `update_learning_rates` is called AFTER `.backward`, is it not? The call to `run_training_batch` does a forward AND backward pass, and the order is as follows:

run_training_batch
update_learning_rates
I believe that at the moment I didn't move the learning rate update line - I just added a condition around it. It might be better / cleaner to move it somewhere else, or to try to move it inside of …
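For readers following along, a minimal sketch of the behavior under discussion (hypothetical names like `train_epoch`; this is not the actual Trainer code): each batch still does a forward and backward pass, but the scheduler only steps when the optimizer does.

```python
import torch

def train_epoch(model, optimizer, scheduler, dataloader,
                accumulate_grad_batches=4):
    optimizer.zero_grad()
    for batch_idx, (x, y) in enumerate(dataloader):
        # Each batch: one forward + one backward pass.
        loss = torch.nn.functional.mse_loss(model(x), y)
        (loss / accumulate_grad_batches).backward()

        # The "condition around it": only step the optimizer -- and with it
        # the learning rate scheduler -- on accumulation boundaries.
        if (batch_idx + 1) % accumulate_grad_batches == 0:
            optimizer.step()
            optimizer.zero_grad()
            scheduler.step()  # LR update tied to the optimizer step
```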
ok got it. that makes sense. it does seem a little dirty at the moment.
This pull request is now in conflict... :(
Hello @rmrao! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-04-20 11:04:10 UTC
Done - added a TODO to potentially merge optimizer step, lr update, and global step increment.
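As a rough illustration of what that TODO might look like (purely hypothetical; not code from this PR), the three operations could be bundled into one helper so they can never drift apart:

```python
def optimizer_step_with_bookkeeping(trainer, optimizer, scheduler):
    # Hypothetical helper merging the three operations named in the TODO.
    optimizer.step()           # optimizer step
    optimizer.zero_grad()
    scheduler.step()           # learning rate update
    trainer.global_step += 1   # global step increment
```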
This pull request is now in conflict... :(
Before submitting
What does this PR do?
Fixes #1476.
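For context, the mismatch only surfaces when gradient accumulation is enabled; a minimal illustration, assuming the standard `accumulate_grad_batches` Trainer flag:

```python
import pytorch_lightning as pl

# With accumulation, the optimizer steps once every 4 batches, so a
# per-step LR scheduler should also step once every 4 batches --
# not on every forward pass.
trainer = pl.Trainer(accumulate_grad_batches=4)
```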
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃