Hello, I am trying to upload a model that has more than 25 files and am hitting this error:
Traceback (most recent call last):
File "/fsx/lewis/git/hf/h4/scripts/deployment/aimo/upload_model.py", line 77, in <module>
main()
File "/fsx/lewis/git/hf/h4/scripts/deployment/aimo/upload_model.py", line 68, in main
kagglehub.model_upload(kaggle_handle, local_dir)
File "/fsx/lewis/miniconda3/envs/h4/lib/python3.10/site-packages/kagglehub/models.py", line 53, in model_upload
create_model_instance_or_version(h, tokens, license_name, version_notes)
File "/fsx/lewis/miniconda3/envs/h4/lib/python3.10/site-packages/kagglehub/models_helpers.py", line 67, in create_model_instance_or_version
raise (e)
File "/fsx/lewis/miniconda3/envs/h4/lib/python3.10/site-packages/kagglehub/models_helpers.py", line 61, in create_model_instance_or_version
_create_model_instance(model_handle, files, license_name)
File "/fsx/lewis/miniconda3/envs/h4/lib/python3.10/site-packages/kagglehub/models_helpers.py", line 34, in _create_model_instance
api_client.post(f"/models/{model_handle.owner}/{model_handle.model}/create/instance", data)
File "/fsx/lewis/miniconda3/envs/h4/lib/python3.10/site-packages/kagglehub/clients.py", line 122, in post
process_post_response(response_dict)
File "/fsx/lewis/miniconda3/envs/h4/lib/python3.10/site-packages/kagglehub/exceptions.py", line 108, in process_post_response
raise BackendError(response["error"], error_code)
kagglehub.exceptions.BackendError: The file count exceeds the maximum of 25
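For reference, the failing call is nothing exotic: it is a single kagglehub.model_upload on a local directory of sharded weights. A minimal sketch of what the script does (the handle and path below are placeholders, not the real ones from the script):

```python
import kagglehub

# Placeholder model handle and directory, for illustration only.
kaggle_handle = "my-org/my-model/transformers/default"
local_dir = "/path/to/local_model_dir"  # holds config, tokenizer files, and >25 weight shards

# Fails with BackendError once the directory contains more than 25 files.
kagglehub.model_upload(kaggle_handle, local_dir)
```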
It would be nice if this constraint could be relaxed, since I typically shard my models into smaller files to speed up model loading in Kaggle notebooks.
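For context, the large file count comes from saving the checkpoint with a small shard size, which makes the weights faster to load in Kaggle notebooks but easily produces more than 25 files. A rough sketch of how the shards are produced (assuming a Hugging Face transformers checkpoint; the model name, path, and shard size are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "my-org/my-model"           # placeholder checkpoint name
output_dir = "/path/to/local_model_dir"  # directory later passed to kagglehub.model_upload

model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# A small max_shard_size splits the weights into many small files, which is what
# speeds up loading on Kaggle but also what trips the 25-file upload limit.
model.save_pretrained(output_dir, max_shard_size="1GB")
tokenizer.save_pretrained(output_dir)
```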