
Conversation

@younesbelkada (Contributor)

What does this PR do?

This PR reverts a change that was made to M2M100Model. To reproduce the issue, run:

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# NLLB-200 checkpoints use the M2M100 architecture under the hood
src_lang = "eng_Latn"
tgt_lang = "spa_Latn"

model_id = "facebook/nllb-200-3.3B"

tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang=src_lang)
# device_map="auto" dispatches loading through accelerate, which is the code path where the KeyError below is raised
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")

On main, users get:

│ /home/younes_huggingface_co/debug_issues/code/transformers/src/transformers/modeling_utils.py:24 │
│ 59 in _load_pretrained_model                                                                     │
│                                                                                                  │
│   2456 │   │   │   for key in missing_keys:                                                      │
│   2457 │   │   │   │   if key.startswith(prefix):                                                │
│   2458 │   │   │   │   │   key = ".".join(key.split(".")[1:])                                    │
│ ❱ 2459 │   │   │   │   param = model_state_dict[key]                                             │
│   2460 │   │   │   │   if param.device == torch.device("meta"):                                  │
│   2461 │   │   │   │   │   if not load_in_8bit:                                                  │
│   2462 │   │   │   │   │   │   set_module_tensor_to_device(model, key, "cpu", torch.empty(*para  │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
KeyError: 'encoder.embed_positions.weights'
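
For readers who land on the same traceback: the loop shown above indexes the freshly initialized model's state_dict with every missing key, so the KeyError means that 'encoder.embed_positions.weights' is expected during loading but is not exposed by model.state_dict(). The snippet below is only a minimal, illustrative sketch of how such a mismatch can arise in PyTorch (for instance when a tensor is registered as a non-persistent buffer); the class and attribute names are made up, and this is not the actual M2M100 code nor a claim about the change being reverted.

import torch
from torch import nn

class ToySinusoidalPositions(nn.Module):
    # Stand-in for a positional-embedding module (illustrative only).
    def __init__(self, num_positions, dim, persistent):
        super().__init__()
        # Non-persistent buffers are excluded from state_dict(), so loading code
        # that later does model.state_dict()[name] for this key raises KeyError.
        self.register_buffer("weights", torch.zeros(num_positions, dim), persistent=persistent)

class ToyEncoder(nn.Module):
    def __init__(self, persistent):
        super().__init__()
        self.embed_positions = ToySinusoidalPositions(16, 8, persistent)

for persistent in (True, False):
    state_dict = ToyEncoder(persistent).state_dict()
    print(persistent, "embed_positions.weights" in state_dict)
    # persistent=True  -> the key is present and can be indexed during loading
    # persistent=False -> the key is absent, so indexing it fails as in the traceback above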

cc @Narsil

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@Narsil (Contributor) left a comment

LGTM

@sgugger (Collaborator) left a comment

Thanks for fixing!

younesbelkada merged commit c0fe912 into huggingface:main on Nov 22, 2022.
mpierrau pushed a commit to mpierrau/transformers that referenced this pull request on Dec 15, 2022.