Support checkpoint hooks on data module #3563
Conversation
Hello @ananthsub! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-09-28 23:30:16 UTC
Force-pushed from b20dc2c to f1f90b5
This pull request is now in conflict... :(
Force-pushed from 5bb71e2 to 0a7346d
Codecov Report

```diff
@@           Coverage Diff           @@
##           master   #3563   +/-   ##
=======================================
- Coverage      91%     91%     -0%
=======================================
  Files         110     110
  Lines        8206    8211     +5
=======================================
- Hits         7463    7457     -6
- Misses        743     754    +11
```
Force-pushed from 0a7346d to aa6b96a
Can we please stop applying meaningless Black formatting, e.g. swapping " and ' or inconsistently splitting argument lines... this makes the PR terribly long even for minimal changes.
Add -S to your Black config.
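For reference, a sketch of that setting in a pyproject.toml (assuming the project configures Black there; the equivalent CLI flag is -S / --skip-string-normalization):

```toml
[tool.black]
# -S: leave existing string quotes alone (keeps ' vs " churn out of diffs)
skip-string-normalization = true
```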
Split out changes from Lightning-AI#3563 to make that PR easier to review
Split out changes from Lightning-AI#3563 to make that PR easier to review. This formats the file according to the Black formatter
… more storage backends (#3694)

* Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* Store a reference to the trainer on the datamodule. Fixes #3682
* Update data_connector.py
* Update data_connector.py
* Update test_datamodules.py
* Support more storage backends in trainer.test using best weights. Similar to #3692
* Update trainer.py
* Update trainer.py: use cloud_io load directly
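A minimal sketch of the best-weights loading path that commit describes, assuming pytorch_lightning.utilities.cloud_io.load and a completed fit() run (not the exact diff):

```python
from pytorch_lightning.utilities.cloud_io import load as pl_load

# `trainer` and `model` are assumed to come from a prior trainer.fit() run.
# cloud_io.load resolves remote checkpoint paths (e.g. s3:// or hdfs://) as
# well as local files, which is what lets trainer.test support more backends.
best_path = trainer.checkpoint_callback.best_model_path
checkpoint = pl_load(best_path, map_location=lambda storage, loc: storage)
model.load_state_dict(checkpoint["state_dict"])
```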
This pull request is now in conflict... :(
refactor on_{save/load}_checkpoint to a separate hook class that both the lightning module and data module inherit; add spots in callback connector to call new datamodule hooks if available
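A rough sketch of the shape this refactor takes (the hook names come from the commit message; bodies and the helper below are assumptions, not the PR diff):

```python
class CheckpointHooks:
    """Hooks related to saving and loading checkpoints."""

    def on_save_checkpoint(self, checkpoint: dict) -> None:
        # Override to write extra state into the checkpoint dict.
        pass

    def on_load_checkpoint(self, checkpoint: dict) -> None:
        # Override to read state back out when a checkpoint is restored.
        pass


# In the trainer's checkpoint path, the new datamodule hook is only called
# when a datamodule was attached, so existing code keeps working
# (hypothetical helper name for illustration):
def dump_datamodule_state(trainer, checkpoint: dict) -> None:
    datamodule = getattr(trainer, "datamodule", None)
    if datamodule is not None:
        datamodule.on_save_checkpoint(checkpoint)
```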
checkout upstream/master
Force-pushed from ccec667 to e3bec6c
Wrote up some local scripts to play around with this. It works as expected.
LGTM 🚀
* Split out changes from #3563 to make that PR easier to review. This formats the file according to the Black formatter
* Store a reference to the trainer on the datamodule. Fixes #3682
* Update data_connector.py
* Update data_connector.py
* Update test_datamodules.py
* Add attribute to datamodule for trainer
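A minimal sketch of the trainer-reference change described above (class and method names are assumptions based on the file names in the commit, not the exact diff):

```python
class DataConnector:
    def __init__(self, trainer):
        self.trainer = trainer

    def attach_datamodule(self, model, datamodule=None):
        if datamodule is None:
            return
        # Expose the trainer on the datamodule so its hooks (including the
        # new checkpoint hooks) can reach trainer state.
        datamodule.trainer = self.trainer
        model.datamodule = datamodule
```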
What does this PR do?
Solves #3544
Refactors on_{save/load}_checkpoint into a new CheckpointHooks class which both the LightningModule and the LightningDataModule inherit.

I'd appreciate comments as to whether the order here matters for saving/loading. Ideally the checkpoint state is independent across data and model, but if there are cases where the model depends on the data, should we prefer to call one's save/load before the other?
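As an illustration of what this enables, a hedged sketch (the vocab attribute and checkpoint key are made up for the example) of a datamodule persisting its own state alongside the model weights:

```python
import pytorch_lightning as pl


class MyDataModule(pl.LightningDataModule):
    def __init__(self):
        super().__init__()
        self.vocab = {}

    def on_save_checkpoint(self, checkpoint: dict) -> None:
        # Stash datamodule state in the same checkpoint dict as the model.
        checkpoint["my_datamodule_vocab"] = self.vocab

    def on_load_checkpoint(self, checkpoint: dict) -> None:
        # Restore it when the checkpoint is loaded.
        self.vocab = checkpoint.get("my_datamodule_vocab", {})
```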
TODO:
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃