No local packages or working download links found for tokenizers==0.12.1.dev0 #1036
@McPatate is working on enabling prebuilt packages, but currently they don't exist as far as I know. You can build the Python package from source (you also need the Rust compiler installed: https://www.rust-lang.org/learn/get-started) and it should work.
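Since the build fails in non-obvious ways when the Rust toolchain is missing, a quick stdlib check (a hypothetical diagnostic helper, not part of the thread) can confirm `rustc` is on `PATH` before attempting the build:

```python
# Check that the Rust compiler is available before building tokenizers
# from source; without it, setup.py fails partway through compilation.
import shutil

rustc = shutil.which("rustc")
if rustc is None:
    print("rustc not found; install it via https://www.rust-lang.org/learn/get-started")
else:
    print(f"rustc found at {rustc}")
```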
Should get something running in the coming weeks :)
Ah I see, thanks for the heads up! When I try building the Python package using the code you supplied, I'm once again faced with the Mac M1 error:
From #712 I gathered that building the package locally should prevent this issue. Do you by any chance know why I might be seeing this error?
Sorry, I'm not an expert in Mac ARM since I don't own one of those beasts. I do know some people were able to build it, though. Letting better-informed people chime in.
@KMFODA I had to do some ninja tricks to get it to work:

```shell
python3 -m pip install setuptools_rust
git clone git@github.com:huggingface/tokenizers.git
cd tokenizers/bindings/python
python3 setup.py install
python3 -m pip install transformers
rm -rf /path/to/venv/lib/python3.x/site-packages/tokenizers
cp -R /path/to/venv/lib/python3.x/site-packages/tokenizers-x.x.x-py3.x-macosx-11-arm64.egg/tokenizers /path/to/venv/lib/python3.x/site-packages/tokenizers
```
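The `/path/to/venv` placeholders in the last two commands can be resolved programmatically instead of typed by hand; a small sketch using only the standard library (an illustrative helper, not from the thread):

```python
# Locate the active environment's site-packages directory, i.e. the
# /path/to/venv/lib/python3.x/site-packages prefix used above.
import sysconfig

site_packages = sysconfig.get_paths()["purelib"]
print(site_packages)  # e.g. /path/to/venv/lib/python3.x/site-packages
```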
Thanks @McPatate, really appreciate the wizardry. Sorry for only trying this now. I just tried following all the steps, but I get blocked on the last step, as there is no folder matching the format tokenizers-x.x.x-py3.x-macosx-11-arm64.egg. I just have
Would using
amazing |
I get the following error when installing tokenizers from source (I'm on a MacBook M1, so I can't install using pip, I believe):

Python version: Python 3.10
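A quick way to confirm whether the interpreter itself is running natively on arm64 (relevant here, because an x86_64 Python running under Rosetta resolves different wheels than a native arm64 build) is a stdlib check like the following sketch:

```python
# Report the architecture and version of the running interpreter.
# On Apple Silicon, a native build reports "arm64"; a build running
# under Rosetta reports "x86_64" and fetches x86_64 wheels from pip.
import platform

print(platform.machine())
print(platform.python_version())
```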