Prepare for release 0.26 #2579
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@LysandreJik's favourite type of PR, with 600+ fewer lines to maintain! 😄 Removing cached_download
is a big milestone. It shouldn't cause too many breaks in the wild, given we've already taken care of opening some PRs and deprecating it over a long cycle. Let's keep an eye open once it's released though :)
Thanks @hanouticelina for the cleanup. Let's wait for @LysandreJik's review before merging.
Very happy to see this cleaned up!
thanks for the reviews! let's merge it 🧹
Hi! I'm here because it looks like removing the cached_download function in the newest release broke the AWS SageMaker Hugging Face Inference Toolkit. I'm using the
Hi @njbrake, thanks for reporting and sorry for the inconvenience. I escalated this internally to see what can be done to fix it.
@njbrake could you share what you are doing? Do you have a
Hi, thanks all for the quick response. In my Docker container, I have the following requirements.txt, which updates the dependencies I care about. I am working in NLP and not using the diffusers library for anything, so I don't do anything to specify that it needs to be upgraded.
To get around the current situation, it sounds like I can either pin the huggingface_hub version or enforce that the diffusers library gets upgraded? It seems odd to need to specify a version of the diffusers library when I don't use it anywhere in my application; it's only pulled in as a dependency of the SM HF inference toolkit. Thanks!
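For readers hitting the same breakage, the pinning workaround mentioned above might look like the following in a requirements.txt. The exact bound is illustrative, assuming the removal lands in 0.26:

```
# Temporary workaround: pin huggingface_hub below 0.26 until the
# toolkit and its dependencies (e.g. an older diffusers) no longer
# import the removed cached_download function.
huggingface_hub<0.26
```

Alternatively, upgrading the transitive dependency that still imports cached_download (here, diffusers) to a release that uses hf_hub_download would also avoid the import error.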
We are looking into this, but it seems an update of transformers leads to a change of
PR following the 0.25.0 release to prepare for the next one.
Main Changes
- Removed the `url_to_filename`, `filename_to_url` and `cached_download` functions.
- Removed the `legacy_cache_layout` argument from `hf_hub_download`.
- Removed the `cached_download` tests and the `legacy_cache_layout` test.
- Removed `deprecate_positional_args` from `InferenceClient.__init__` and `AsyncInferenceClient.__init__`. This means enforcing keyword-only arguments for all parameters after `model` is now effective!
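The keyword-only enforcement described in the last point can be illustrated with plain Python: a bare `*` in a signature makes every parameter after it keyword-only, so passing them positionally raises a `TypeError` instead of the deprecation warning that `deprecate_positional_args` used to emit. The class below is a simplified stand-in, not the real `InferenceClient`:

```python
class Client:
    """Simplified stand-in for InferenceClient: everything after
    `model` is keyword-only because of the bare `*`."""

    def __init__(self, model=None, *, token=None, timeout=None):
        self.model = model
        self.token = token
        self.timeout = timeout


# Keyword arguments work as before.
client = Client("some-model", token="hf_xxx")

# Positional use of `token` is now a hard error, not a warning.
try:
    Client("some-model", "hf_xxx")
except TypeError as err:
    print("rejected:", err)
```

Callers that already passed everything after `model` by keyword are unaffected by the change.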