Ignore parameters causing ValueError when dumping to YAML #19804
Conversation
I'm having the same issue; this fix would be useful.
This fix should proceed soon. Otherwise, there is no way to test multiple models.
Codecov Report: All modified and coverable lines are covered by tests ✅
Additional details and impacted files:
@@            Coverage Diff             @@
##           master   #19804      +/-   ##
==========================================
- Coverage      84%      53%     -31%
==========================================
  Files         426      418       -8
  Lines       35280    35127     -153
==========================================
- Hits        29616    18708   -10908
- Misses       5664    16419   +10755
Thank you @Callidior, great fix!
What does this PR do?
Fixes #19730
Unserializable parameters such as nn.Modules or Tensors often fail with a ValueError when calling yaml.dump on them under PyTorch 2.x. Lightning already tries to handle this type of error, but so far it only catches TypeError. This PR proposes to additionally catch ValueError.
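A minimal sketch of the behavior this change targets (the helper name drop_unserializable is hypothetical and not Lightning's actual API): each hyperparameter value is probed with yaml.dump and skipped if it raises either TypeError or ValueError, so one unserializable entry no longer aborts the whole dump.

```python
import yaml


def drop_unserializable(hparams: dict) -> dict:
    """Return only the entries of ``hparams`` that can be dumped to YAML.

    Values that raise TypeError or ValueError during ``yaml.dump`` (e.g.
    nn.Modules or Tensors under PyTorch 2.x) are skipped instead of
    aborting the dump. Previously only TypeError was handled.
    """
    allowed = {}
    for key, value in hparams.items():
        try:
            yaml.dump(value)  # probe whether this single value serializes
        except (TypeError, ValueError):  # this PR adds ValueError here
            print(f"Skipping '{key}': value is not YAML-serializable")
        else:
            allowed[key] = value
    return allowed
```

Per the linked issue, an nn.Module or Tensor value can raise ValueError under PyTorch 2.x, so with the broader except clause such an entry would simply be dropped while the remaining serializable hyperparameters are still written to the YAML file.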
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--19804.org.readthedocs.build/en/19804/