
Conversation

@gante (Contributor) commented on Jan 21, 2025

What does this PR do?

Fixes a few GPT2 generation tests that have been failing since #34026. Previously, max_length=20 was implicit; this PR makes it explicit.

[I'm making sure generation tests on the main models are green to ensure #35802 won't break beam search in any form]
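For reference, a minimal sketch of the change in spirit (not the exact test code from this PR): instead of relying on the old implicit default generation length, the length is passed explicitly to generate. The checkpoint and prompt below are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")

# Before #34026 the affected tests relied on an implicit max_length of 20;
# passing it explicitly keeps the generated sequence length stable.
outputs = model.generate(**inputs, max_length=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```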

@gante requested a review from ydshieh on January 21, 2025, 19:13
@ydshieh (Collaborator) left a comment


Thank you.

Feel free to merge 🤗

@gante merged commit 36c9181 into huggingface:main on Jan 22, 2025
16 checks passed
@gante deleted the fix_gpt2_tests branch on January 22, 2025, 09:41
bursteratom pushed a commit to bursteratom/transformers that referenced this pull request Jan 31, 2025
elvircrn pushed a commit to elvircrn/transformers that referenced this pull request Feb 13, 2025