Conversation

@YenFuLin
Contributor

What does this PR do?

The function resize_token_embeddings() is defined in modeling_utils.py, but some VLMs override it and forget to add the new mean_resizing argument, so users cannot change the mean_resizing value. This argument should be just as configurable for VLMs, so this PR refines those overrides to accept and forward it (see the sketch below).

Fixes #35357
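
For illustration, a minimal sketch of the pattern (the class name is hypothetical; the real diff updates the existing overrides in several VLM modeling files):

```python
# A minimal sketch of the fix, not the exact diff: MyVLMForConditionalGeneration
# is a hypothetical stand-in for the VLM classes the PR actually touches.
from typing import Optional

from torch import nn
from transformers import PreTrainedModel


class MyVLMForConditionalGeneration(PreTrainedModel):
    # Before the fix, overrides looked roughly like this and silently dropped
    # mean_resizing, so the value a caller passed never reached the base
    # implementation in modeling_utils.py:
    #
    #     def resize_token_embeddings(self, new_num_tokens=None, pad_to_multiple_of=None):
    #         return super().resize_token_embeddings(new_num_tokens, pad_to_multiple_of)
    #
    # After the fix, the override accepts the argument and forwards it:
    def resize_token_embeddings(
        self,
        new_num_tokens: Optional[int] = None,
        pad_to_multiple_of: Optional[int] = None,
        mean_resizing: bool = True,
    ) -> nn.Embedding:
        model_embeds = super().resize_token_embeddings(
            new_num_tokens, pad_to_multiple_of, mean_resizing=mean_resizing
        )
        # Model-specific bookkeeping (e.g. keeping the config's vocab_size in
        # sync with the new embedding matrix) stays unchanged.
        self.config.vocab_size = model_embeds.num_embeddings
        return model_embeds
```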

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@Rocketknight1

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Member

@Rocketknight1 Rocketknight1 left a comment

Yes, this looks good to me! Mean resizing should be the default for most token embeddings in future, and this PR cleanly passes the option through to the super method.

cc @ArthurZucker for core maintainer review!

@qubvel qubvel removed their request for review January 20, 2025 17:46
Collaborator

@ArthurZucker ArthurZucker left a comment

Yep, very good catch! This was already on by default, so all good, not breaking!
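
For context, a short usage sketch of what the fix enables (the checkpoint is only an example of the kind of VLM affected):

```python
from transformers import AutoTokenizer, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"  # example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(model_id)

tokenizer.add_tokens(["<my_new_token>"])
# mean_resizing=True (the default) initializes new embedding rows from the
# mean of the existing ones; with this PR, passing False actually reaches the
# base implementation instead of being dropped by the model's override.
model.resize_token_embeddings(len(tokenizer), mean_resizing=False)
```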

@ArthurZucker ArthurZucker merged commit 9d2056f into huggingface:main Feb 3, 2025
16 checks passed
elvircrn pushed a commit to elvircrn/transformers that referenced this pull request Feb 13, 2025
…gface#35717)

* refine all resize_token_embedding()

* ruff format

* hotfix
sbucaille pushed a commit to sbucaille/transformers that referenced this pull request Feb 16, 2025
…gface#35717)

* refine all resize_token_embedding()

* ruff format

* hotfix

Development

Successfully merging this pull request may close these issues.

Default value for mean_resizing in resize_token_embeddings should be False
