Prune deprecated utils modules #7503
Conversation
Codecov Report
@@           Coverage Diff            @@
##           master   #7503     +/-   ##
=========================================
- Coverage      92%     88%      -4%
=========================================
  Files         198     196       -2
  Lines       12843   12834       -9
=========================================
- Hits        11839   11270     -569
- Misses       1004    1564     +560
I think we should address this in this PR. cc: @awaelchli
from pytorch_lightning.utilities.argparse import *  # noqa: F403 E402 F401

# for backward compatibility with old checkpoints (versions < 1.2.0)
# that need to be able to unpickle the function from the checkpoint
from pytorch_lightning.utilities.argparse import _gpus_arg_default  # noqa: E402 F401 # isort: skip
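For context, a minimal sketch (not from this PR) of why that last re-export matters: pickle stores a function by its module path and name, and unpickling an old checkpoint re-imports that exact path, roughly like this:

import importlib

# What unpickling effectively does with the reference stored in a < 1.2.0 checkpoint,
# i.e. "pytorch_lightning.utilities.argparse_utils._gpus_arg_default":
module = importlib.import_module("pytorch_lightning.utilities.argparse_utils")
# raises AttributeError if the re-export above is dropped
# (or ModuleNotFoundError if argparse_utils.py is deleted entirely)
fn = getattr(module, "_gpus_arg_default")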
@carmocca In theory this should be fine because we still have the import below here for BC.
Unless we remove the file argparse_utils.py we don't need to worry (yet)
The problem here is that the deprecation tests have been removed but not the deprecation message
And if we do, this will be forgotten for eternity haha
Is that okay?
I think the best way would be to add a test for legacy checkpoints which covers this use case.
At the moment we only have a very simple legacy model to test with...
So I would merge this as it is and raise a priority issue to update the legacy checkpoint tests 🐰
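A rough sketch (mine, not part of this PR) of what such a test could look like, assuming pre-generated checkpoints are stored under a hypothetical legacy/checkpoints/<version>/ layout:

import glob
import os

import pytest
import torch

LEGACY_CHECKPOINTS_PATH = os.path.join("legacy", "checkpoints")  # hypothetical layout


@pytest.mark.parametrize("ckpt_path", sorted(glob.glob(os.path.join(LEGACY_CHECKPOINTS_PATH, "*", "*.ckpt"))))
def test_legacy_checkpoint_unpickles(ckpt_path):
    # Loading exercises unpickling of everything an old Lightning version stored,
    # including references such as _gpus_arg_default; if a backward-compatibility
    # re-export is removed, torch.load raises and this test breaks.
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    assert "state_dict" in checkpoint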
> And if we do, this will be forgotten for eternity haha
> Is that okay?

No, I will never forget the pain it has caused me. 🤣
If you want, I can generate one of these checkpoints and we add a test that loads it. Then if someone removes the line, the test will break.
Nah, no need for that. If we're going to spend time on it, it should be on figuring out the monkey patching.
Yes, can you prepare a model which would cover this issue?
https://github.com/PyTorchLightning/pytorch-lightning/blob/master/legacy/zero_training.py
then we can regenerate all legacy checkpoints
All we need to do is add self.save_hyperparameters() to the zero_training model and generate the default hparams from Trainer.parse_argparse_args. The problem is that save_hyperparameters() hasn't existed for that long and has changed over time, so I fear we won't be able to get this working for very old versions of Lightning. We may need multiple versions of the script as the Lightning feature set evolves over time.
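A rough sketch (mine, not the PR's code) of how the zero_training model could be extended along these lines; the class names are hypothetical stand-ins, and it assumes Trainer.add_argparse_args is available to produce the default hparams:

from argparse import ArgumentParser

import torch
from torch.utils.data import DataLoader, Dataset

from pytorch_lightning import LightningModule, Trainer


class RandomDataset(Dataset):
    def __init__(self, size: int = 32, length: int = 64):
        self.data = torch.randn(length, size)

    def __getitem__(self, index):
        return self.data[index]

    def __len__(self):
        return len(self.data)


class DummyModel(LightningModule):
    def __init__(self, **trainer_defaults):
        super().__init__()
        # Pickles the passed hparams into the checkpoint, so loading the
        # checkpoint later exercises unpickling of whatever they reference.
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        return self(batch).sum()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


if __name__ == "__main__":
    # Generate the Trainer defaults through argparse so they end up in hparams.
    parser = Trainer.add_argparse_args(ArgumentParser())
    args = parser.parse_args([])
    model = DummyModel(**vars(args))
    trainer = Trainer(max_epochs=1, limit_train_batches=2)
    trainer.fit(model, DataLoader(RandomDataset(), batch_size=8))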
we can regenerate just some "recent" checkpoints...
cc: @kaushikb11
What does this PR do?
Remove deprecated modules; they were renamed in the past.
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines.
Did you have fun?
Make sure you had fun coding 🙃