[VLM] Add base model without head #37033
The first hunk adds a `_key_mapping` class attribute to `PreTrainedModel`:

```diff
@@ -1787,6 +1787,8 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMix
     main_input_name = "input_ids"
     model_tags = None
 
+    _key_mapping = None  # used for BC support in VLMs, not meant to be used by new models
+
     _auto_class = None
     _no_split_modules = None
     _skip_keys_device_placement = None
```
The second hunk makes `from_pretrained` fall back to that attribute when no explicit `key_mapping` is passed:

```diff
@@ -4067,7 +4069,7 @@ def from_pretrained(
         generation_config = kwargs.pop("generation_config", None)
         gguf_file = kwargs.pop("gguf_file", None)
         tp_plan = kwargs.pop("tp_plan", None)
-        key_mapping = kwargs.pop("key_mapping", None)
+        key_mapping = kwargs.pop("key_mapping", cls._key_mapping)
```
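For context, here is a minimal sketch of how such a mapping is meant to be consumed, assuming `key_mapping` is a dict of regex patterns to replacements applied over checkpoint keys with `re.sub`. The class name, the mapping entries, and the `_rename_key` helper are all illustrative, not the PR's actual loading code:

```python
import re


class MyVLMForConditionalGeneration:  # stands in for a PreTrainedModel subclass
    # Hypothetical BC mapping: regex patterns over old checkpoint keys,
    # mapped to replacements in the new base-model layout.
    _key_mapping = {
        r"^language_model\.model\.": "model.language_model.",
        r"^vision_tower\.": "model.vision_tower.",
    }


def _rename_key(key: str, key_mapping: dict) -> str:
    """Apply every pattern -> replacement rule to a single checkpoint key."""
    for pattern, replacement in key_mapping.items():
        key = re.sub(pattern, replacement, key)
    return key


# A key from a legacy (old-layout) checkpoint gets rewritten:
assert _rename_key(
    "language_model.model.embed_tokens.weight",
    MyVLMForConditionalGeneration._key_mapping,
) == "model.language_model.embed_tokens.weight"
```

With `_key_mapping` as the default, old-layout checkpoints keep loading without callers having to pass `key_mapping` explicitly, which is the backward-compatibility point of the diff.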
A reviewer suggested guarding the attribute access:

```diff
-        key_mapping = kwargs.pop("key_mapping", cls._key_mapping)
+        key_mapping = kwargs.pop("key_mapping", getattr(cls, "_key_mapping", None))
```
But I'm just a bit worried it could become an issue when saving and then reloading?
Yeah, that's the only case. I verified by saving and reloading that it works. We can remove it if you think it's not needed; imo the new classes do not have to be loadable in transformers. In vLLM they use their own loader, and I am adding these mappings on their end as well.
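One way to see why the save-then-reload round trip can be safe (an illustrative argument, not lifted from the PR): anchored patterns in the mapping simply fail to match keys that are already in the new layout, so `re.sub` passes them through unchanged.

```python
import re

# Hypothetical BC mapping from the old layout to the new one.
key_mapping = {r"^language_model\.model\.": "model.language_model."}


def remap(key: str) -> str:
    for pattern, replacement in key_mapping.items():
        key = re.sub(pattern, replacement, key)
    return key


old_key = "language_model.model.layers.0.self_attn.q_proj.weight"
new_key = "model.language_model.layers.0.self_attn.q_proj.weight"

assert remap(old_key) == new_key  # legacy checkpoints get converted
assert remap(new_key) == new_key  # re-saved checkpoints pass through untouched
```

This only holds if the patterns cannot match already-converted keys (e.g. they are anchored with `^`); an unanchored pattern could rewrite a key again on a later load.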