Conversation

@gante (Contributor) commented Sep 5, 2023

What does this PR do?

Thank you @nick-maykr for raising the related issue.

In a nutshell: at some point in `Seq2SeqTrainer`, if `max_length` and `num_beams` were not set through the legacy arguments, we were fetching them from `model.config`. This is wrong -- all the default-handling logic now lives in `model.generation_config`, so no explicit value setting is needed.
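
For context, a minimal sketch of the pattern being removed and what replaces it (illustrative only, not the exact diff; it assumes the `gen_kwargs` dict and the `generation_max_length`/`generation_num_beams` legacy arguments used around `Seq2SeqTrainer.evaluate()`):

```python
# Before: evaluate()/predict() silently fell back to model.config, which can
# carry stale generation defaults.
if gen_kwargs.get("max_length") is None:
    gen_kwargs["max_length"] = self.model.config.max_length
if gen_kwargs.get("num_beams") is None:
    gen_kwargs["num_beams"] = self.model.config.num_beams

# After: only forward values the user set explicitly through the legacy
# arguments; otherwise leave the keys unset so model.generation_config
# supplies the defaults inside generate().
if gen_kwargs.get("max_length") is None and self.args.generation_max_length is not None:
    gen_kwargs["max_length"] = self.args.generation_max_length
if gen_kwargs.get("num_beams") is None and self.args.generation_num_beams is not None:
    gen_kwargs["num_beams"] = self.args.generation_num_beams
```

The key design point is that an absent key in `gen_kwargs` is now meaningful: `generate()` resolves it from `model.generation_config`, so the trainer no longer needs to pre-fill defaults itself.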

@HuggingFaceDocBuilderDev commented Sep 5, 2023

The documentation is not available anymore as the PR was closed or merged.

@amyeroberts (Contributor) left a comment


Thanks for fixing!

@gante gante merged commit 9a70d6e into huggingface:main Sep 5, 2023
@gante gante deleted the max_length_warning branch September 5, 2023 13:47
parambharat pushed a commit to parambharat/transformers that referenced this pull request Sep 26, 2023