Conversation

@ydshieh ydshieh (Collaborator) commented Dec 20, 2022

What does this PR do?

This PR removes the unused `use_cache` attribute from the Lilt, Longformer and Canine config classes. These models only implement encoder-only task heads (question answering, sequence/token classification, etc.), so `use_cache` is never read in their modeling files.
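
For readers skimming the change, here is a minimal sketch of the kind of attribute being removed. It is not the actual diff; `ToyEncoderOnlyConfig` is a hypothetical stand-in for `LiltConfig`, `LongformerConfig` and `CanineConfig`, showing the "before" state:

```python
# Hypothetical sketch of the "before" state; not the actual transformers code.
from transformers import PretrainedConfig


class ToyEncoderOnlyConfig(PretrainedConfig):
    """Hypothetical stand-in for LiltConfig / LongformerConfig / CanineConfig."""

    model_type = "toy-encoder-only"

    def __init__(self, hidden_size=768, use_cache=True, **kwargs):
        super().__init__(**kwargs)
        self.hidden_size = hidden_size
        # `use_cache` only matters for decoder-style caching of past key/values.
        # An encoder-only model's forward pass never reads it, so the attribute
        # is dead configuration state; the cleanup drops the parameter and this
        # assignment.
        self.use_cache = use_cache
```

After the cleanup, the `use_cache` parameter and the `self.use_cache` assignment are simply gone; behaviour is unchanged because nothing in these models' modeling files reads the attribute.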

@ydshieh ydshieh requested a review from NielsRogge December 20, 2022 10:36
@NielsRogge NielsRogge (Contributor) left a comment

Thanks for fixing!

@ydshieh ydshieh requested a review from LysandreJik December 20, 2022 10:44
@HuggingFaceDocBuilderDev commented Dec 20, 2022

The documentation is not available anymore as the PR was closed or merged.

@LysandreJik LysandreJik (Member) left a comment

Thank you, @ydshieh!

@ydshieh ydshieh merged commit 2280880 into main Dec 20, 2022
@ydshieh ydshieh deleted the cleanup_config_attrs_5 branch December 20, 2022 15:46
MKhalusova pushed a commit to MKhalusova/transformers that referenced this pull request Dec 28, 2022
remove unused use_cache in config classes

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
silverriver pushed a commit to silverriver/transformers that referenced this pull request Jan 6, 2023
remove unused use_cache in config classes

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>