feat: Add download acceleration for dependencies and HuggingFace models #83
Merged
Conversation
feat: add pydantic dependency and bump to v0.10.0
- Add pydantic>=2.0.0 for enhanced protocol models
- Version bump to 0.10.0 for the download acceleration feature
feat: extend protobuf protocol for download acceleration
- Add accelerate_downloads and hf_models_to_cache fields to FunctionRequest
- Enhance Pydantic models with improved type annotations and documentation
- Maintain backward compatibility with the existing protocol
- Support HuggingFace model pre-caching for faster inference startup
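The new request fields might look roughly like the following. This is a hedged sketch only: the PR says the project uses Pydantic models, but since the actual model definition isn't shown here, the shape is approximated with a stdlib dataclass, and the `function_name` field is an assumed pre-existing field for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FunctionRequest:
    """Sketch of FunctionRequest with the two new acceleration fields."""
    function_name: str                              # assumed pre-existing field
    accelerate_downloads: bool = False              # new: opt in to accelerated downloads
    hf_models_to_cache: Optional[List[str]] = None  # new: HF model IDs to pre-cache

# Defaults preserve backward compatibility: older callers that never set the
# new fields construct requests exactly as before.
legacy_req = FunctionRequest(function_name="infer")
new_req = FunctionRequest(
    function_name="infer",
    accelerate_downloads=True,
    hf_models_to_cache=["bert-base-uncased"],
)
```

Making both fields optional with safe defaults is what lets the protocol change land without breaking existing clients.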
feat: implement download acceleration in client interface
- Add accelerate_downloads and hf_models_to_cache parameters to the @remote decorator
- Update function and class decoration to pass acceleration options
- Extend docstrings with comprehensive parameter documentation
- Enable HuggingFace model pre-caching through decorator configuration
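A toy stand-in for the extended decorator can illustrate how the two new parameters flow through decoration. Only the parameter names `accelerate_downloads` and `hf_models_to_cache` come from this PR; everything else (storing the options as an attribute, the wrapped function's behavior) is an assumption for demonstration.

```python
from functools import wraps

def remote(accelerate_downloads=False, hf_models_to_cache=None, **config):
    """Toy sketch of the @remote decorator capturing the new options."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            # The real implementation would dispatch to remote execution;
            # here we just call the function locally.
            return fn(*args, **kwargs)
        # Record the acceleration settings so the execution layer can
        # include them in the outgoing request.
        wrapper.acceleration = {
            "accelerate_downloads": accelerate_downloads,
            "hf_models_to_cache": hf_models_to_cache or [],
        }
        return wrapper
    return decorator

@remote(accelerate_downloads=True, hf_models_to_cache=["bert-base-uncased"])
def run_inference(text):
    return text.upper()
```

Because both parameters default to "off", existing `@remote` usages keep working unchanged, matching the backward-compatibility claim above.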
feat: update class execution system for download acceleration
- Add acceleration parameters to the create_remote_class function
- Store acceleration settings in RemoteClassWrapper instances
- Pass acceleration options through to remote execution requests
- Maintain compatibility with existing class decoration patterns
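The class-side change can be sketched the same way. The names `create_remote_class` and `RemoteClassWrapper` and the two acceleration parameters come from the commit summary; the wrapper's internals below are hypothetical.

```python
class RemoteClassWrapper:
    """Sketch: wrapper that stores acceleration settings alongside the class."""
    def __init__(self, cls, accelerate_downloads=False, hf_models_to_cache=None):
        self._cls = cls
        # Settings are kept on the wrapper so every remote call made through
        # it can attach them to the execution request.
        self.accelerate_downloads = accelerate_downloads
        self.hf_models_to_cache = hf_models_to_cache or []

def create_remote_class(cls, accelerate_downloads=False, hf_models_to_cache=None):
    """Sketch: thread the new parameters into the wrapper, defaulting to off."""
    return RemoteClassWrapper(cls, accelerate_downloads, hf_models_to_cache)

class Model:
    pass

wrapped = create_remote_class(Model, accelerate_downloads=True)
```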
feat: update stubs to support download acceleration parameters
- Extend prepare_request methods to accept acceleration parameters
- Update request building to include the new acceleration fields
- Maintain consistency across execution pathways
- Preserve existing stub interface contracts
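At the stub layer, extending `prepare_request` amounts to accepting the new parameters and copying them into the outgoing payload. The sketch below assumes a plain-dict payload; the real stubs build the protobuf/Pydantic request, which isn't shown in this PR summary.

```python
def prepare_request(function_name, accelerate_downloads=False,
                    hf_models_to_cache=None):
    """Sketch: build a request payload including the new acceleration fields."""
    request = {"function_name": function_name}
    # Defaults keep the existing stub contract intact for callers that
    # never pass the new parameters.
    request["accelerate_downloads"] = accelerate_downloads
    request["hf_models_to_cache"] = hf_models_to_cache or []
    return request

req = prepare_request("infer", accelerate_downloads=True,
                      hf_models_to_cache=["gpt2"])
```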
test: update tests for download acceleration compatibility
- Update create_remote_class calls to include the new acceleration parameters
- Ensure all existing tests pass with the enhanced function signatures
- Add proper parameter defaults for backward compatibility
- Maintain test coverage for class execution patterns
pandyamarut (Contributor) approved these changes on Aug 18, 2025 and left a comment:
/LGTM
pandyamarut pushed a commit referencing this pull request on Sep 9, 2025:
feat: Add download acceleration for dependencies and HuggingFace models (#83)

* feat: add pydantic dependency and bump to v0.10.0
* feat: extend protobuf protocol for download acceleration
* feat: implement download acceleration in client interface
* feat: update class execution system for download acceleration
* feat: update stubs to support download acceleration parameters
* test: update tests for download acceleration compatibility
Introduces support for download acceleration in the Tetra runtime. This speeds up remote execution startup by pre-caching pip dependencies and HuggingFace models.

Includes new `accelerate_downloads` and `hf_models_to_cache` parameters for the `@remote` decorator, with backward compatibility.

Related to runpod-workers/worker-tetra#22