feat: Allow serializing and deserializing with Tensorizer without passing --model-loader-extra-config #8
Merged
sangstar merged 11 commits into sangstar/tensorizer-aws-update-and-any-kwargs on Jun 12, 2025
Conversation
…el files Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Some changes to `TensorizerConfig` added a few parameters that are used internally for convenience but were exposed as public parameters. This unnecessarily complicates `TensorizerConfig`, as it makes these look like important parameters users need to understand and contend with to use `TensorizerConfig` with the public-facing API. They have been made private, so users can disregard them and have fewer parameters to wrap their heads around. Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Simply call `snapshot_download` to a tempdir and serialize that to S3 for model artifacts, completely decoupling Tensorizer from the original machinery needed to load specific files. Signed-off-by: Sanger Steel <sangersteel@gmail.com>
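The pattern described in the commit above can be sketched roughly as follows. This is a hedged, self-contained illustration, not the PR's actual implementation: the downloader and serializer are injected as callables so the sketch runs without network access, where in practice `huggingface_hub.snapshot_download` and a Tensorizer-based serializer would fill those roles.

```python
import tempfile


def snapshot_then_serialize(model_id, serialize_dir_fn, download_fn):
    """Download a full model repo to a tempdir, then serialize the
    whole directory (e.g. to S3).

    download_fn stands in for huggingface_hub.snapshot_download and
    serialize_dir_fn stands in for a Tensorizer-based serializer;
    both are hypothetical parameters used here for illustration.
    """
    with tempfile.TemporaryDirectory() as tmpdir:
        # Fetch every artifact in the repo into the tempdir, so the
        # serializer needs no knowledge of specific file names.
        local_path = download_fn(model_id, local_dir=tmpdir)
        return serialize_dir_fn(local_path)
```

Serializing the entire snapshot directory is what decouples the serializer from the original per-file loading machinery.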
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
arsenetar
reviewed
Jun 6, 2025
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Since `model_loader_extra_config` can be a `TensorizerConfig` instance as well as a dict, add a `__getitem__` method to `TensorizerConfig` and fix the checker function to work without importing `TensorizerConfig` (which would have caused a circular import) Signed-off-by: Sanger Steel <sangersteel@gmail.com>
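The change described above can be sketched like this. The field names and the name-based checker are illustrative assumptions, not vLLM's actual `TensorizerConfig`; checking the class name is just one plausible way to detect the type without importing it.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TensorizerConfig:
    # Illustrative fields only; the real config carries more parameters.
    tensorizer_uri: str = ""
    num_readers: Optional[int] = None

    def __getitem__(self, key):
        # Allow dict-style access so code that receives either a plain
        # dict or a TensorizerConfig can index both uniformly.
        return getattr(self, key)


def is_tensorizer_config(obj) -> bool:
    # Compare by class name rather than isinstance(), avoiding an
    # import of TensorizerConfig that would be circular.
    return type(obj).__name__ == "TensorizerConfig"
```

With `__getitem__` in place, callers can write `config["tensorizer_uri"]` whether `config` is a dict or a `TensorizerConfig`.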
Eta0
approved these changes
Jun 12, 2025
Eta0
suggested changes
Jun 12, 2025
Eta0
left a comment
Requested a few changes when reviewing this in a Zoom call, will approve when those are fixed.
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Eta0
approved these changes
Jun 12, 2025
Signed-off-by: Sanger Steel <sangersteel@gmail.com>
Serializing and deserializing with Tensorizer just with AWS config/credential files or env vars
This PR streamlines Tensorizer usage patterns with vLLM by allowing Tensorizer to handle model loading (and saving) from vLLM just by passing
`load_format=tensorizer`. As a basic demonstration of what this looks like, the targets run in this Makefile show the ways Tensorizer can now be used to save and load models without passing things like `--model-loader-extra-config` JSON strings. Please hold off on any requests for changes relating to formatting; that will all be done in a later stage with vLLM's formatter.
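To make the before/after concrete, here is a hedged sketch of the two invocation styles. The bucket and object names are made up, and the exact new CLI surface is an assumption based on this PR's description; the point is only that the JSON blob disappears.

```python
import json

# Old style: Tensorizer parameters travel through a JSON string
# passed via --model-loader-extra-config.
# (Bucket and key names are hypothetical, for illustration only.)
extra_config = {"tensorizer_uri": "s3://example-bucket/model.tensors"}
old_cli = [
    "vllm", "serve", "facebook/opt-125m",
    "--load-format", "tensorizer",
    "--model-loader-extra-config", json.dumps(extra_config),
]

# New style (per this PR): no extra-config JSON is needed; AWS
# credentials come from the environment or standard config files.
new_cli = [
    "vllm", "serve", "s3://example-bucket/model",
    "--load-format", "tensorizer",
]
```

The old form forces users to understand and hand-assemble `TensorizerConfig` fields as JSON; the new form leaves that entirely to the loader.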