
Colab issue: took 9 minutes to complete #743

Closed
PaoloPgn opened this issue Mar 19, 2024 · 6 comments · Fixed by #746

Comments

@PaoloPgn

Hi there,
I use Dynamic Prompts in Colab, and every time I have to manually run the command
!pip install dynamicprompts[attentiongrabber,magicprompt]~=0.30.4
Before the latest updates that command took 7-9 seconds to complete; now it takes 9 minutes. Why?
Is there a way to get the same result in less time?
Thank you

@akx
Collaborator

akx commented Mar 19, 2024

Without knowing or seeing what the command prints after that (or additional details about your environment), it's hard to tell.

@PaoloPgn
Author

OK, the next time I launch it I'll post the full output of the command in Colab here (I cleared it in today's session) so the issue is easier to understand. Thank you for now.

@PaoloPgn
Author

PaoloPgn commented Mar 20, 2024

This is what happens during those 9 minutes:


Collecting dynamicprompts[attentiongrabber,magicprompt]~=0.30.4
  Downloading dynamicprompts-0.30.4-py3-none-any.whl (51 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 51.6/51.6 kB 1.7 MB/s eta 0:00:00
Requirement already satisfied: jinja2~=3.1 in /usr/local/lib/python3.10/dist-packages (from dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (3.1.2)
Requirement already satisfied: pyparsing~=3.0 in /usr/local/lib/python3.10/dist-packages (from dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (3.1.2)
Requirement already satisfied: transformers[torch]~=4.19 in /usr/local/lib/python3.10/dist-packages (from dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (4.28.1)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2~=3.1->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (2.1.5)
Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (3.13.1)
Requirement already satisfied: huggingface-hub<1.0,>=0.11.0 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (0.14.1)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (1.25.2)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (24.0)
Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (6.0.1)
Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (2023.12.25)
Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (2.28.2)
Requirement already satisfied: tokenizers!=0.11.3,<0.14,>=0.11.1 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (0.13.3)
Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (4.66.2)
Requirement already satisfied: torch!=1.12.0,>=1.9 in /usr/local/lib/python3.10/dist-packages (from transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (2.2.1+cu121)
Requirement already satisfied: fsspec in /usr/local/lib/python3.10/dist-packages (from huggingface-hub<1.0,>=0.11.0->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (2023.6.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.10/dist-packages (from huggingface-hub<1.0,>=0.11.0->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (4.10.0)
Requirement already satisfied: sympy in /usr/local/lib/python3.10/dist-packages (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (1.12)
Requirement already satisfied: networkx in /usr/local/lib/python3.10/dist-packages (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (3.2.1)
Collecting nvidia-cuda-nvrtc-cu12==12.1.105 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (23.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 23.7/23.7 MB 39.5 MB/s eta 0:00:00
Collecting nvidia-cuda-runtime-cu12==12.1.105 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (823 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 823.6/823.6 kB 24.5 MB/s eta 0:00:00
Collecting nvidia-cuda-cupti-cu12==12.1.105 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (14.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.1/14.1 MB 59.3 MB/s eta 0:00:00
Collecting nvidia-cudnn-cu12==8.9.2.26 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl (731.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 731.7/731.7 MB 1.0 MB/s eta 0:00:00
Collecting nvidia-cublas-cu12==12.1.3.1 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl (410.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 410.6/410.6 MB 3.6 MB/s eta 0:00:00
Collecting nvidia-cufft-cu12==11.0.2.54 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl (121.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.6/121.6 MB 8.2 MB/s eta 0:00:00
Collecting nvidia-curand-cu12==10.3.2.106 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl (56.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 56.5/56.5 MB 11.1 MB/s eta 0:00:00
Collecting nvidia-cusolver-cu12==11.4.5.107 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl (124.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 124.2/124.2 MB 8.5 MB/s eta 0:00:00
Collecting nvidia-cusparse-cu12==12.1.0.106 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl (196.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 196.0/196.0 MB 7.0 MB/s eta 0:00:00
Collecting nvidia-nccl-cu12==2.19.3 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_nccl_cu12-2.19.3-py3-none-manylinux1_x86_64.whl (166.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 166.0/166.0 MB 7.7 MB/s eta 0:00:00
Collecting nvidia-nvtx-cu12==12.1.105 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (99 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 99.1/99.1 kB 13.4 MB/s eta 0:00:00
Collecting triton==2.2.0 (from torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading triton-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (167.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 167.9/167.9 MB 6.4 MB/s eta 0:00:00
Collecting nvidia-nvjitlink-cu12 (from nvidia-cusolver-cu12==11.4.5.107->torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4)
  Downloading nvidia_nvjitlink_cu12-12.4.99-py3-none-manylinux2014_x86_64.whl (21.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.1/21.1 MB 65.8 MB/s eta 0:00:00
Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (3.1.0)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (3.6)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (1.26.12)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (2024.2.2)
Requirement already satisfied: mpmath>=0.19 in /usr/local/lib/python3.10/dist-packages (from sympy->torch!=1.12.0,>=1.9->transformers[torch]~=4.19->dynamicprompts[attentiongrabber,magicprompt]~=0.30.4) (1.3.0)
Installing collected packages: triton, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, nvidia-cusparse-cu12, nvidia-cudnn-cu12, dynamicprompts, nvidia-cusolver-cu12
  Attempting uninstall: triton
    Found existing installation: triton 2.1.0
    Uninstalling triton-2.1.0:
      Successfully uninstalled triton-2.1.0
  Attempting uninstall: dynamicprompts
    Found existing installation: dynamicprompts 0.29.0
    Uninstalling dynamicprompts-0.29.0:
      Successfully uninstalled dynamicprompts-0.29.0
Successfully installed dynamicprompts-0.30.4 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-8.9.2.26 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.19.3 nvidia-nvjitlink-cu12-12.4.99 nvidia-nvtx-cu12-12.1.105 triton-2.2.0

akx added a commit to akx/dynamicprompts that referenced this issue Mar 21, 2024
@akx akx closed this as completed in #746 Mar 21, 2024
@akx
Collaborator

akx commented Mar 21, 2024

@PaoloPgn Okay, please upgrade the extension to the latest main version – should fix your issue (which has to do with the Transformers transitive dependency trying to install another version of Torch).
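For reference, the pip log above shows torch 2.2.1+cu121 already satisfied, yet pip still downloads the full CUDA 12 wheel set. One hedged workaround on the user's side (a sketch only, not the fix that #746 actually implements) is to pin the preinstalled torch version in the same install command, so the resolver has no reason to pick a different torch build. The version string below is taken from the log; the command is echoed here rather than executed:

```shell
# Sketch only (assumption: torch 2.2.1+cu121 is the version already present
# in the Colab runtime, per the "Requirement already satisfied" line above).
# Pinning it in the same invocation keeps pip from resolving another build.
TORCH_VERSION="2.2.1+cu121"
CMD="pip install \"torch==${TORCH_VERSION}\" \"dynamicprompts[attentiongrabber,magicprompt]~=0.30.4\""
echo "$CMD"   # echoed here instead of executed
```

In a real notebook cell the version could be read dynamically with `python3 -c 'import torch; print(torch.__version__)'` instead of being hard-coded.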

@PaoloPgn
Author

I'm in Google Colab; I started Stable Diffusion, updated the extension via the Extensions tab > Check for updates > Apply and restart, then restarted Stable Diffusion on Colab, but the same messages persist asking for version 0.30.4. Do I have to completely uninstall the extension and reinstall it from scratch?

@PaoloPgn
Author

PaoloPgn commented Mar 22, 2024

Today, after 13 minutes it was still downloading... Is there a way to run a command that installs just the missing package and not everything else?
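For what it's worth, one way to attempt exactly that (a hedged sketch, not something verified in this thread) is pip's `--no-deps` flag, which installs only the named package and skips dependency resolution entirely. It relies on the assumption that the Colab runtime already has working versions of everything dynamicprompts needs, which the earlier log's "Requirement already satisfied" lines suggest. The command is echoed here rather than executed:

```shell
# Sketch (assumes the Colab image already ships jinja2, pyparsing and
# transformers[torch], as the pip log above indicates). --no-deps installs
# only dynamicprompts itself, so the large CUDA wheels pulled in via the
# torch dependency chain are never touched.
CMD='pip install --no-deps "dynamicprompts[attentiongrabber,magicprompt]~=0.30.4"'
echo "$CMD"   # echoed here instead of executed
```

The trade-off is that pip no longer verifies the dependencies at all, so a genuinely missing requirement would only surface at import time.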
