
fix bug for torch.uint1-7 not support in torch<2.6 #10368

Closed
wants to merge 1 commit

Conversation

baymax591
Contributor

What does this PR do?

Fixes a bug where torch.uint1 through torch.uint7 are not supported in torch < 2.6.

Before this PR, executing the following command

python -c "from diffusers import AutoPipelineForText2Image"

fails with the following error:

Traceback (most recent call last):
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 920, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/root/miniconda3/envs/baymax/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/pipelines/auto_pipeline.py", line 21, in <module>
    from ..models.controlnets import ControlNetUnionModel
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/models/controlnets/__init__.py", line 5, in <module>
    from .controlnet import ControlNetModel, ControlNetOutput
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/models/controlnets/controlnet.py", line 22, in <module>
    from ...loaders.single_file_model import FromOriginalModelMixin
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/loaders/single_file_model.py", line 23, in <module>
    from ..quantizers import DiffusersAutoQuantizer
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/quantizers/__init__.py", line 15, in <module>
    from .auto import DiffusersAutoQuantizer
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/quantizers/auto.py", line 31, in <module>
    from .torchao import TorchAoHfQuantizer
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/quantizers/torchao/__init__.py", line 15, in <module>
    from .torchao_quantizer import TorchAoHfQuantizer
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/quantizers/torchao/torchao_quantizer.py", line 45, in <module>
    torch.uint1,
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/torch/__init__.py", line 1833, in __getattr__
    raise AttributeError(f"module '{__name__}' has no attribute '{name}'")
AttributeError: module 'torch' has no attribute 'uint1'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 911, in __getattr__
    value = getattr(module, name)
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 910, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/root/miniconda3/envs/baymax/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 922, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.pipelines.auto_pipeline because of the following error (look up to see its traceback):
module 'torch' has no attribute 'uint1'

The reason is that older versions of torch do not expose torch.uint1 through torch.uint7.
This PR adds a check on the torch version to avoid the error and removes some redundant code; a rough sketch of the idea is shown below.
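A minimal sketch of that idea (illustrative only, not the exact diff; the tuple name and base dtypes are hypothetical stand-ins for what torchao_quantizer.py actually defines):

import torch
from packaging import version

# hypothetical base set of supported dtypes
SUPPORTED_TORCH_DTYPES = (torch.int8, torch.float16, torch.float32)

# only reference torch.uint1 ... torch.uint7 when the installed torch exposes them
if version.parse(torch.__version__) >= version.parse("2.6"):
    SUPPORTED_TORCH_DTYPES += tuple(getattr(torch, f"uint{i}") for i in range(1, 8))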

Review comment on these changed lines of the diff:

torch.uint7,
)

if version.parse(torch.__version__) >= version.parse('2.6'):
Collaborator


oh thanks so much!
can we use is_torch_version?

def is_torch_version(operation: str, version: str):
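A sketch of what that suggestion could look like, assuming the existing diffusers.utils.is_torch_version helper; the tuple name and base dtypes are the same illustrative placeholders as in the sketch above:

import torch
from diffusers.utils import is_torch_version

# illustrative base set of supported dtypes
SUPPORTED_TORCH_DTYPES = (torch.int8, torch.float16, torch.float32)

# is_torch_version(operation, version) compares against the installed torch version
if is_torch_version(">=", "2.6"):
    SUPPORTED_TORCH_DTYPES += tuple(getattr(torch, f"uint{i}") for i in range(1, 8))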

Member


Thank you! Apologies that I missed this 😅

I think this should be something like:

if version <= 2.5:
    # do not use any of the uintx
else:
    # use the uintx types as well

I believe the uintx data types were added in 2.5 and not 2.6

Member


Also, it may be good to have another branch that adds float8_e4m3fn and float8_e5m2 only if the torch version is >= 2.2. int8 came in 2.0, I believe, but since the minimum diffusers requirement is torch>=1.4, another branch for it might be needed. LMK if I can help with any changes.
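A hedged sketch of this multi-branch idea; the version thresholds are the ones stated in this thread (not independently verified here), and the names are illustrative:

import torch
from diffusers.utils import is_torch_version

# dtypes assumed available on every supported torch version (illustrative base set)
SUPPORTED_TORCH_DTYPES = (torch.float16, torch.float32)

if is_torch_version(">=", "2.0"):
    SUPPORTED_TORCH_DTYPES += (torch.int8,)
if is_torch_version(">=", "2.2"):
    SUPPORTED_TORCH_DTYPES += (torch.float8_e4m3fn, torch.float8_e5m2)
if is_torch_version(">=", "2.5"):
    # per the earlier comment, the uintx dtypes landed in 2.5
    SUPPORTED_TORCH_DTYPES += tuple(getattr(torch, f"uint{i}") for i in range(1, 8))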

@yiyixuxu
Collaborator

Are you able to run "from diffusers import StableDiffusion3Pipeline"?

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

yiyixuxu added a commit that referenced this pull request Dec 24, 2024
* fix bug for torch.uint1-7 not support in torch<2.6

* up

---------

Co-authored-by: baymax591 <[email protected]>
@yiyixuxu
Collaborator

hi @baymax591
Thanks so much for the PR! We finished it up and got it merged in PR #10368.
(We normally ask before taking over a PR, but because it is a bug we wanted to fix it quickly, sorry!) Your commit is in that PR and you're an author :)

closing this one since it is merged

yiyixuxu closed this Dec 24, 2024
a-r-r-o-w pushed a commit that referenced this pull request Dec 25, 2024
* fix bug for torch.uint1-7 not support in torch<2.6

* up

---------

Co-authored-by: baymax591 <[email protected]>
clrpackages pushed a commit to clearlinux-pkgs/pypi-diffusers that referenced this pull request Jan 16, 2025
…ersion 0.32.1

Aryan (2):
      Fix TorchAO related bugs; revert device_map changes (#10371)
      Release: v0.32.1

Sayak Paul (1):
      fix test pypi installation in the release workflow (#10360)

YiYi Xu (1):
      make style for huggingface/diffusers#10368 (#10370)