Fix TF generate (probably)
#21301
Conversation
@ArthurZucker The PR #20944 failed the test tests/models/encoder_decoder/test_modeling_tf_encoder_decoder.py::TFBertEncoderDecoderModelTest::test_bert2bert_summarization, while one commit before it still passed. With #20944, the outputs from the above test are somehow gibberish. We can wait for this PR to be merged; then could you take a look at this issue 🙏? Here is the traceback:

E AssertionError: Lists differ: ['sa sa sa university sa sa sigma sa sa th[501 chars] sa'] != ["sae was founded in 1856, five years befo[236 chars]hs."]
E
E First differing element 0:
E 'sa sa sa university sa sa sigma sa sa th[500 chars]a sa'
E "sae was founded in 1856, five years befo[235 chars]ths."
E
E Diff is 897 characters long. Set self.maxDiff to None to see it.
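The truncated diff above is standard unittest behavior: assertEqual shortens long diffs unless `maxDiff` is disabled. A minimal standalone sketch (not taken from the transformers test suite) of how a test class opts into full diffs:

```python
import unittest


class LongDiffExample(unittest.TestCase):
    # unittest truncates long assertEqual diffs by default and prints the
    # hint "Diff is N characters long. Set self.maxDiff to None to see it."
    # Setting maxDiff to None on the class disables that truncation.
    maxDiff = None

    def test_long_lists_match(self):
        generated = ["sa " * 100]
        expected = ["sa " * 100]
        # On a mismatch, the complete character-level diff would be shown
        # because maxDiff is None.
        self.assertEqual(generated, expected)
```

Running this with `python -m unittest` executes the test case; the class attribute applies to every assertion in the class.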
The documentation is not available anymore as the PR was closed or merged.
Basically this is because there has to be a
Thank you. We are going to have 0 test failures soon!
Instead of adding this extra `if` to handle the `None` case, the fix should be to remove
Force-pushed from 93a9540 to 98aae9a.
@gante Thanks! Updated the PR :-)
gante left a comment
Thanks for fixing 💛
@gante I merged this PR as it is. However, we could potentially improve the code in
What does this PR do?
We have CI failures for `TFBertEncoderDecoderModelTest.test_bert2bert_summarization` and `TFGPT2EncoderDecoderModelTest.test_bert2gpt2_summarization`. The error message is the `AssertionError` shown above.

These 2 tests pass `max_length=None` to `generate`, and this line (in `generate`):

transformers/src/transformers/generation/tf_utils.py, line 613 (at 63b204e)

changes `generation_config.max_length` from `20` (the default value) to `None`, and finally we get an error at:

transformers/src/transformers/generation/tf_utils.py, line 719 (at 63b204e)

This PR checks if `generation_config.max_length is not None` before doing the comparison; the 2 tests pass with this change. But we need @gante to see if this is the right fix.
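The guard described above can be sketched as follows. This is a hypothetical simplification, not the actual `tf_utils.py` source; the function name and error message are illustrative:

```python
# Hypothetical sketch of the max_length guard (illustrative names,
# not the actual transformers implementation).
def validate_generation_length(input_ids_seq_length, max_length):
    """Complain only when max_length is set and already reached."""
    # Without the `is not None` check, the comparison
    # `input_ids_seq_length >= max_length` raises a TypeError when
    # max_length is None (as happens when a caller passes
    # max_length=None to generate()).
    if max_length is not None and input_ids_seq_length >= max_length:
        raise ValueError(
            f"Input length {input_ids_seq_length} is >= max_length "
            f"{max_length}; generation would produce no new tokens."
        )
    return True


# max_length=None now skips the comparison instead of crashing:
validate_generation_length(10, None)
validate_generation_length(10, 20)
```

The alternative gante hints at above would be to avoid ever setting `max_length` to `None` in the first place, so no downstream comparison needs the extra `if`.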