
Prepare for release 0.26 #2579

Merged
merged 1 commit into main from prepare-for-0.26 on Oct 1, 2024

Conversation

hanouticelina
Contributor

@hanouticelina hanouticelina commented Sep 30, 2024

PR following the 0.25.0 release to prepare for the next one.

Main Changes

  • bump version to 0.26.0.dev0
  • remove url_to_filename, filename_to_url and cached_download functions.
  • remove legacy_cache_layout argument from hf_hub_download.
  • remove cached_download tests and legacy_cache_layout test.
  • remove deprecate_positional_args from InferenceClient.__init__ and AsyncInferenceClient.__init__. This means keyword-only arguments for all parameters after model are now enforced!
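The last bullet can be sketched as follows. This is a minimal stand-in class, not huggingface_hub's actual implementation: a bare `*` in the signature is what makes every parameter after `model` keyword-only.

```python
# Illustrative sketch of the keyword-only pattern now enforced (hypothetical
# stand-in class, not huggingface_hub's real InferenceClient).
class InferenceClientSketch:
    def __init__(self, model=None, *, token=None, timeout=None):
        # Everything after the bare `*` must be passed by keyword.
        self.model = model
        self.token = token
        self.timeout = timeout

client = InferenceClientSketch("my-model", token="hf_xxx")  # OK: token by keyword
try:
    InferenceClientSketch("my-model", "hf_xxx")  # token passed positionally
except TypeError as exc:
    print("rejected:", exc)
```

Callers that previously passed `token` or other parameters positionally now get a `TypeError` instead of a deprecation warning.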

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Contributor

@Wauplin Wauplin left a comment


@LysandreJik's favourite type of PR, with 600+ fewer lines to maintain! 😄 Removing cached_download is a big milestone. It shouldn't cause too many breaks in the wild given we've already taken care of opening some PRs and deprecating it over a long cycle. Let's keep an eye open once it's released though :)

Thanks @hanouticelina for the cleaning. Let's wait for @LysandreJik review before merging

Member

@LysandreJik LysandreJik left a comment


Very happy to see this cleaned up!

@hanouticelina
Contributor Author

thanks for the reviews! let's merge it 🧹

@hanouticelina hanouticelina merged commit 833bdc5 into main Oct 1, 2024
19 checks passed
@hanouticelina hanouticelina deleted the prepare-for-0.26 branch October 1, 2024 13:21
@njbrake

njbrake commented Oct 18, 2024

Hi! I'm here because it looks like removing the cached_download function in the newest release broke the AWS SageMaker Hugging Face Inference Toolkit.

I'm using the huggingface-pytorch-inference:2.1.0-transformers4.37.0-cpu-py310-ubuntu22.04 container, which, as far as I can tell, is the newest container available.

https://github.com/aws/deep-learning-containers/blob/master/available_images.md#huggingface-inference-containers

```
Traceback (most recent call last):
  File "/usr/local/bin/dockerd-entrypoint.py", line 21, in <module>
    from sagemaker_huggingface_inference_toolkit import serving
  File "/opt/conda/lib/python3.10/site-packages/sagemaker_huggingface_inference_toolkit/serving.py", line 18, in <module>
    from sagemaker_huggingface_inference_toolkit import handler_service, mms_model_server
  File "/opt/conda/lib/python3.10/site-packages/sagemaker_huggingface_inference_toolkit/handler_service.py", line 28, in <module>
    from sagemaker_huggingface_inference_toolkit.transformers_utils import (
  File "/opt/conda/lib/python3.10/site-packages/sagemaker_huggingface_inference_toolkit/transformers_utils.py", line 26, in <module>
    from sagemaker_huggingface_inference_toolkit.diffusers_utils import get_diffusers_pipeline, is_diffusers_available
  File "/opt/conda/lib/python3.10/site-packages/sagemaker_huggingface_inference_toolkit/diffusers_utils.py", line 32, in <module>
    from diffusers import AutoPipelineForText2Image, DPMSolverMultistepScheduler, StableDiffusionPipeline
  File "/opt/conda/lib/python3.10/site-packages/diffusers/__init__.py", line 5, in <module>
    from .utils import (
  File "/opt/conda/lib/python3.10/site-packages/diffusers/utils/__init__.py", line 38, in <module>
    from .dynamic_modules_utils import get_class_from_dynamic_module
  File "/opt/conda/lib/python3.10/site-packages/diffusers/utils/dynamic_modules_utils.py", line 28, in <module>
    from huggingface_hub import HfFolder, cached_download, hf_hub_download, model_info
ImportError: cannot import name 'cached_download' from 'huggingface_hub' (/opt/conda/lib/python3.10/site-packages/huggingface_hub/__init__.py)
```
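For downstream code hit by this removal, a hedged migration sketch (the `fetch_config` helper and the `repo_id`/`filename` values are illustrative, not from the toolkit): calls to the removed `cached_download(url)` can generally be replaced by `hf_hub_download(repo_id=..., filename=...)`, which resolves and caches the file in one step using the modern cache layout.

```python
# Hedged migration sketch: replace the removed cached_download(url) with
# hf_hub_download(repo_id=..., filename=...). The import is guarded so the
# sketch still loads in environments where huggingface_hub isn't installed.
try:
    from huggingface_hub import hf_hub_download
except ImportError:  # huggingface_hub not available here
    hf_hub_download = None

def fetch_config(repo_id: str = "gpt2", filename: str = "config.json"):
    """Illustrative helper: download one file from the Hub and return its
    local cache path. Old code built a URL and passed it to cached_download;
    hf_hub_download takes the repo id and filename directly."""
    if hf_hub_download is None:
        return None
    return hf_hub_download(repo_id=repo_id, filename=filename)
```

Note that the other names in the failing diffusers import line (HfFolder, hf_hub_download, model_info) still exist in 0.26; only cached_download was removed.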

@Wauplin
Contributor

Wauplin commented Oct 18, 2024

Hi @njbrake, thanks for reporting and sorry for the inconvenience. I escalated this internally to see what can be done to fix it.
The error comes from the fact that diffusers tries to import cached_download, which has been removed from huggingface_hub. However, we already updated diffusers to avoid this issue (see huggingface/diffusers#8419, released in diffusers 0.29.0), so updating diffusers to a recent version should fix it. Could you tell us which version of diffusers is installed in the container you are using?
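If an upgrade is an option, a sketch of that fix on the container side (assuming pip is available in the image and the environment allows upgrades):

```shell
# Upgrade diffusers to a version that no longer imports cached_download,
# instead of pinning huggingface_hub back to an older release.
pip install --upgrade "diffusers>=0.29.0"
```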

@philschmid
Member

@njbrake could you share what you are doing? Do you have a requirements.txt which installs some dependencies?

@njbrake

njbrake commented Oct 18, 2024

Hi, thanks all for the quick response. In my docker container, I have the following requirements.txt which updates the dependencies I care about. I am working in NLP and not using the diffusers library for anything, so I don't do anything to specify that it needs to be upgraded.

```
torch==2.0
transformers==4.41.2
nltk==3.8.1
spacy==3.4.4
peft==0.12.0
```

To get around the current situation, it sounds like I can either pin the huggingface_hub version or force an upgrade of the diffusers library? It seems weird to need to specify a version of the diffusers library when I don't use it for anything in my application; it's only getting pulled in as a dependency of the SM HF inference toolkit.

Thanks!

@philschmid
Member

We are looking into this, but it seems an update of transformers leads to a newer huggingface_hub, which removed a method that another library still uses. As a quick workaround, can you add huggingface_hub<0.26.0 to your requirements.txt?
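Concretely, with that workaround applied, the requirements.txt shared earlier in the thread would look like this (versions copied from that comment; the pin is a temporary measure until the container ships a diffusers release that no longer imports cached_download):

```
torch==2.0
transformers==4.41.2
nltk==3.8.1
spacy==3.4.4
peft==0.12.0
huggingface_hub<0.26.0
```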
