
question about load sanapipeline #134

Open
LearningHx opened this issue Jan 8, 2025 · 3 comments
Comments

@LearningHx

pipe = SanaPipeline.from_pretrained(
"Efficient-Large-Model/Sana_1600M_1024px_BF16_diffusers",
variant="bf16",
torch_dtype=torch.bfloat16,
)
I encountered an error when loading the model with the code above: `OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /home/ma-user/work/hexiao/Sana-main_distillation/weights/Sana_600M_512px-diffusers/text_encoder.`

I found that the pretrained text encoder on Hugging Face consists of two files named model-00001-of-00002.safetensors and model-00002-of-00002.safetensors. What do these two weight files represent, and how should I modify the code to load the model correctly?
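For context on the two files (not stated in this thread, but standard Hugging Face behavior): model-00001-of-00002.safetensors and model-00002-of-00002.safetensors are shards of one checkpoint that was split for upload, and a model.safetensors.index.json file in the same directory maps each tensor name to the shard that holds it; `from_pretrained` reads that index and reassembles the shards automatically. A minimal sketch of what such an index looks like (the tensor names below are illustrative, not the actual Gemma text-encoder keys):

```python
# Sketch: a sharded checkpoint ships an index file (model.safetensors.index.json)
# whose "weight_map" points each tensor name at its shard file.
index = {
    "metadata": {"total_size": 9_000_000_000},
    "weight_map": {
        "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
        "model.norm.weight": "model-00002-of-00002.safetensors",
    },
}

# from_pretrained enumerates the shards from the index and loads each one:
shards = sorted(set(index["weight_map"].values()))
print(shards)
```

So the shards are not something to merge or rename by hand; a loader that understands the index file handles them transparently.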

@lawrence-cj
Collaborator

Run `pip install git+https://github.com/huggingface/diffusers` before using Sana in diffusers:

import torch
from diffusers import SanaPipeline

pipe = SanaPipeline.from_pretrained(
    "Efficient-Large-Model/Sana_1600M_1024px_BF16_diffusers",
    variant="bf16",
    torch_dtype=torch.bfloat16,
)

What's your diffusers version?

@LearningHx
Author

I have installed diffusers==0.32.1.

@lawrence-cj
Collaborator

You may need to upgrade your huggingface_hub and transformers libraries at the same time.
I haven't seen this problem before.
You don't need to modify any weights.
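A quick way to check whether the three libraries are in sync is to print their installed versions side by side (a minimal sketch; the exact minimum versions Sana requires are not stated in this thread):

```python
# Print the installed versions of the libraries the Sana pipeline
# depends on, so a stale huggingface_hub or transformers is easy to spot.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("diffusers", "transformers", "huggingface_hub"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```

If any of the three is outdated, `pip install -U transformers huggingface_hub` alongside the diffusers install above is a reasonable first step.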

