
Conversation

@zucchini-nlp
Member

What does this PR do?

As per the title: we can't copy from llama or any other LLM because Gemma3 needs to obtain `text_config` params and has to pass extra vision kwargs in `forward`. The code was therefore adapted from llama, and the tests are green.

Fixes #36755
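For context on the adapted llama-style head: sequence-classification models of this kind read the classification logits at the hidden state of the last non-padding token of each sequence. A minimal, dependency-free sketch of that index selection (illustrative only, not the actual Gemma3 implementation):

```python
def last_non_pad_index(input_ids, pad_token_id):
    """Return, for each sequence in the batch, the index of the last
    token that is not padding; a llama-style classification head reads
    its logits at this position."""
    indices = []
    for seq in input_ids:
        # Start at the final position and walk left past any padding.
        idx = len(seq) - 1
        while idx > 0 and seq[idx] == pad_token_id:
            idx -= 1
        indices.append(idx)
    return indices

# Right-padded batch: logits are read at positions 2 and 4.
batch = [[5, 6, 7, 0, 0], [5, 6, 7, 8, 9]]
print(last_non_pad_index(batch, pad_token_id=0))  # [2, 4]
```

In the real model this selection is done on a tensor of hidden states; the sketch only shows the indexing logic for right-padded inputs.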

@zucchini-nlp zucchini-nlp requested a review from Cyrilvallez July 17, 2025 07:08
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Comment on lines +353 to +360
@unittest.skip("Loading nested configs with overwritten `kwargs` isn't supported yet, FIXME @raushan.")
def test_load_with_mismatched_shapes(self):
pass

@unittest.skip("Loading nested configs with overwritten `kwargs` isn't supported yet, FIXME @raushan.")
def test_mismatched_shapes_have_properly_initialized_weights(self):
pass

Member Author

@zucchini-nlp zucchini-nlp Jul 17, 2025

Here I mean loading like `Gemma3Config.from_dict(config_dict, vocab_size=100)`, where `vocab_size` is actually part of `config.text_config`. It is a known issue, and supporting it is in my plans.
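One way such nested overrides could be routed is to forward any kwarg whose key belongs to the nested text config into `text_config` before building the config object. This is a hypothetical sketch over plain dicts, not the transformers config API (`apply_overrides` is an invented helper name):

```python
def apply_overrides(config_dict, **kwargs):
    """Route override kwargs either to the top-level config dict or,
    when the key belongs to the nested text config, into `text_config`.
    Hypothetical sketch; real config classes validate keys differently."""
    cfg = dict(config_dict)
    text = dict(cfg.get("text_config", {}))
    for key, value in kwargs.items():
        if key in text:
            # Key belongs to the nested text config, e.g. vocab_size.
            text[key] = value
        else:
            # Otherwise treat it as a top-level attribute.
            cfg[key] = value
    cfg["text_config"] = text
    return cfg

cfg = apply_overrides(
    {"model_type": "gemma3", "text_config": {"vocab_size": 262144}},
    vocab_size=100,
)
print(cfg["text_config"]["vocab_size"])  # 100
```

The skipped tests above fail precisely because this routing does not happen yet: the override lands on the top-level config instead of `text_config`.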

@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: auto, gemma3

Member

@Cyrilvallez Cyrilvallez left a comment

Alright, but don't we have any other VLM from which we can inherit directly with modular?

@zucchini-nlp
Member Author

Nope, we usually don't add one when there's no official pretrained checkpoint.

@zucchini-nlp zucchini-nlp merged commit e42681b into huggingface:main Jul 21, 2025
25 checks passed
zucchini-nlp added a commit to zucchini-nlp/transformers that referenced this pull request Jul 22, 2025
* add seq clf class

* fix docs and add in auto-map

* skip tests

* optional pixels
zaristei pushed a commit to zaristei/transformers that referenced this pull request Sep 9, 2025
* add seq clf class

* fix docs and add in auto-map

* skip tests

* optional pixels


Development

Successfully merging this pull request may close these issues.

Add Gemma 3 For Sequence Classification

4 participants