Don't modify tied_weight_keys in-place #43619
Conversation
vasqu left a comment
Just a bit confused about the test, since we check for the same keys? Does it work as intended?
vasqu left a comment
Thanks for clarifying, my bad, I didn't properly see the second case.
```python
# Ignore copy
def test_tie_weights_is_not_modified(self):
    # this model doesn't need a test
    pass
```
Just passing through, why doesn't this need a test? (wrt the comment)
The model definition is slightly different from GroundingDino, but the tests are `# Copied from`. I don't see `_tied_weights_keys` being modified for MMGroundingDino.
CI is still super flaky :(
[For maintainers] Suggested jobs to run (before merge): run-slow: deformable_detr, grounding_dino, mm_grounding_dino
As discussed, CI is still flaky; merging, as the test that fails is unrelated. Can you write something internally in Slack?
What does this PR do?
As per the title, some models add or delete entries in their tied weight keys depending on configuration. If we load two models consecutively with different configs, the second fails to tie weights correctly.
I am copying it in `__init__`, the same way as `keep_in_fp32`. We could also simply `copy` the cls attribute in these two models, where the in-place modification happens. WDYT?