
Add datamodule parameter to lr_find() #3425

Merged
merged 13 commits into Lightning-AI:master on Oct 1, 2020

Conversation

GimmickNG
Contributor

@GimmickNG GimmickNG commented Sep 9, 2020

What does this PR do?

Fixes #3424 - Adds datamodule support to Trainer.lr_find() - a LightningDataModule can be passed to lr_find().
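As a usage sketch (the model and datamodule below are toy stand-ins written for illustration, not code from this repository), the new parameter allows something like:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class ToyModel(pl.LightningModule):
    """Minimal model with no dataloader methods of its own."""

    def __init__(self, lr=1e-3):
        super().__init__()
        self.lr = lr  # learning-rate attribute that the finder's suggestion can replace
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)


class ToyDataModule(pl.LightningDataModule):
    """Minimal datamodule that is the only source of training data."""

    def train_dataloader(self):
        x = torch.randn(4096, 32)
        y = torch.randint(0, 2, (4096,))
        return DataLoader(TensorDataset(x, y), batch_size=32)


model = ToyModel()
dm = ToyDataModule()
trainer = pl.Trainer(max_epochs=1)

# New in this PR: the datamodule can be passed straight to lr_find().
lr_finder = trainer.tuner.lr_find(model, datamodule=dm)
print(lr_finder.suggestion())  # suggested learning rate from the sweep
```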

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@GimmickNG GimmickNG marked this pull request as ready for review September 9, 2020 17:01
@mergify mergify bot requested a review from a team September 9, 2020 17:02
@codecov

codecov bot commented Sep 9, 2020

Codecov Report

Merging #3425 into master will increase coverage by 8%.
The diff coverage is 100%.

@@           Coverage Diff           @@
##           master   #3425    +/-   ##
=======================================
+ Coverage      83%     90%    +8%     
=======================================
  Files         108     108            
  Lines        8807    8386   -421     
=======================================
+ Hits         7284    7584   +300     
+ Misses       1523     802   -721     

@williamFalcon
Contributor

does this work?
can you add a test?

@mergify
Contributor

mergify bot commented Sep 10, 2020

This pull request is now in conflict... :(

@GimmickNG
Contributor Author

Yeah, it does - AFAIK all it does is forward the datamodule parameter to the fit() method for the temporary trainer created in lr_find().

I'm not quite sure how to write tests but I'll write them once I figure out how to do so.
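Roughly, the change amounts to something like this (a sketch of the forwarding idea only, not the actual diff; the setup and restore around the search are elided):

```python
# Sketch: lr_find() accepts a LightningDataModule and hands it to the fit() call
# that runs the learning-rate sweep; everything else about the search is unchanged.
def lr_find(trainer, model, train_dataloader=None, val_dataloaders=None,
            datamodule=None, min_lr=1e-8, max_lr=1.0, num_training=100):
    # ... save trainer state, attach the LR-sweep scheduler/callback, cap max_steps ...
    trainer.fit(
        model,
        train_dataloader=train_dataloader,
        val_dataloaders=val_dataloaders,
        datamodule=datamodule,  # new: forwarded so the datamodule's loaders are used
    )
    # ... restore trainer state and return the object holding the sweep results ...
```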

@SkafteNicki
Member

@GimmickNG changes look fine.
For testing, you can write an altered version of this test:
https://github.com/PyTorchLightning/pytorch-lightning/blob/5abf7d91232322e7f87fe65ad490245d9c1894c4/tests/trainer/test_lr_finder.py#L133-L152
where you remove all dataloaders from the model and then feed a datamodule into the lr_find method. If the search finishes, the data is being passed correctly.
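In rough form (a sketch only, reusing the toy model/datamodule from the PR description above in place of the repository's test utilities), such a test could look like:

```python
import pytorch_lightning as pl


def test_lr_find_with_datamodule(tmpdir):
    """The model defines no dataloaders, so the data must come from the datamodule."""
    model = ToyModel()
    dm = ToyDataModule()
    before_lr = model.lr

    trainer = pl.Trainer(default_root_dir=tmpdir, max_epochs=2)
    lr_finder = trainer.tuner.lr_find(model, datamodule=dm)
    after_lr = lr_finder.suggestion()

    # If the datamodule was passed through correctly, the sweep finishes and
    # yields a suggestion that differs from the initial learning rate.
    assert after_lr is not None
    assert before_lr != after_lr
```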

@Borda Borda added the "bug" (Something isn't working) label Sep 10, 2020
pytorch_lightning/trainer/lr_finder.py (outdated review thread, resolved)
@mergify mergify bot requested a review from a team September 10, 2020 13:33
@GimmickNG
Contributor Author

@GimmickNG changes look fine.
For testing, you can write an altered version of this test:

https://github.com/PyTorchLightning/pytorch-lightning/blob/5abf7d91232322e7f87fe65ad490245d9c1894c4/tests/trainer/test_lr_finder.py#L133-L152

where you remove all dataloaders from the model and then feed a datamodule into the lr_find method. If the search finishes, the data is being passed correctly.

I've added a test for the datamodule parameter. I'm not sure if I've done it correctly, though.

[I'd originally written a test which checked whether passing the datamodule parameter works when calling trainer.tune(model) with Trainer(auto_lr_find=True), because I was under the impression that calling tune() would behave differently from calling tuner.lr_find(). I've rolled those changes back and simplified the test to the one suggested by @SkafteNicki. Both invocation paths are sketched below.]
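For reference, the two invocation paths mentioned above (a hedged sketch, again using the toy model/datamodule from the PR description; once the datamodule is forwarded, both should run the same search):

```python
import pytorch_lightning as pl

model, dm = ToyModel(), ToyDataModule()

# 1) Explicit call, as exercised by the test in this PR:
trainer = pl.Trainer(max_epochs=1)
lr_finder = trainer.tuner.lr_find(model, datamodule=dm)

# 2) Via tune() with auto_lr_find=True, which triggers the same search and
#    writes the suggestion back onto the model's lr attribute:
trainer = pl.Trainer(auto_lr_find=True, max_epochs=1)
trainer.tune(model, datamodule=dm)
```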

@mergify mergify bot requested a review from a team September 11, 2020 06:20
@SkafteNicki
Member

We are currently doing a lot of internal refactoring, so the conflicted file pytorch_lightning/trainer/lr_finder.py no longer exists. Please redo the changes in this file instead: pytorch_lightning/tuner/lr_finder.py.

@GimmickNG
Contributor Author

OK.

@edenlightning
Contributor

Hey @GimmickNG! Refactors are mostly done, mind taking another look?

@GimmickNG
Contributor Author

Hey @GimmickNG! Refactors are mostly done, mind taking another look?

Sorry, I've been rather busy the past week. Should I create another pull request with the updated version of the fork?

@Borda
Member

Borda commented Sep 17, 2020

Hey @GimmickNG! Refactors are mostly done, mind taking another look?

Sorry, I've been rather busy the past week. Should I create another pull request with the updated version of the fork?

I think that resolving the conflict is fine (rebase on the actual master).

@GimmickNG
Contributor Author

Sorry, I can't seem to do that on the web client, and I'm rather inexperienced with this. How would I get Git to recognize that the file has been moved to a different module during the rebase?

@SkafteNicki
Member

@GimmickNG I merged master into your code and fixed the conflicts. The failing checks seem to be unrelated to this PR. Please take a look and make sure the PR is still doing what you intended, then we can go ahead and approve the changes.

Co-authored-by: Carlos Mocholí <[email protected]>
@GimmickNG
Contributor Author

@GimmickNG I merged master into your code and fixed the conflicts. The failing checks seem to be unrelated to this PR. Please take a look and make sure the PR is still doing what you intended, then we can go ahead and approve the changes.

Thanks! I checked it and trainer.tuner.lr_find(model, datamodule=data_module) works, at least on my end. I think it should be fine to close.

@Borda Borda requested review from SkafteNicki and Borda September 25, 2020 12:22
Member

@Borda Borda left a comment


LGTM

pytorch_lightning/tuner/tuning.py (outdated review thread, resolved)
@mergify mergify bot requested a review from a team September 25, 2020 12:29
@Borda
Member

Borda commented Sep 25, 2020

@GimmickNG I merged master into your code and fixed the conflicts. The failing checks seem to be unrelated to this PR. Please take a look and make sure the PR is still doing what you intended, then we can go ahead and approve the changes.

Thanks! I checked it and trainer.tuner.lr_find(model, datamodule=data_module) works, at least on my end. I think it should be fine to close.

I would keep it open for merging, as it adds a test for the before/after behavior...

@mergify mergify bot requested a review from a team September 25, 2020 12:34
Member

@SkafteNicki SkafteNicki left a comment


LGTM

@mergify mergify bot requested a review from a team September 25, 2020 12:40
@pep8speaks

pep8speaks commented Sep 27, 2020

Hello @GimmickNG! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-09-30 13:57:10 UTC

@SkafteNicki SkafteNicki added the "ready" (PRs ready to be merged) label Sep 28, 2020
@williamFalcon
Contributor

Please rebase master...
cc @Borda

@SkafteNicki SkafteNicki merged commit e4e60e9 into Lightning-AI:master Oct 1, 2020
@Borda Borda added this to the 0.10.0 milestone Oct 7, 2020
Labels
bug (Something isn't working), ready (PRs ready to be merged)
Development

Successfully merging this pull request may close these issues.

DataModule with lr_find not supported
9 participants