[Misc] compressed-tensors code reuse #7277
Conversation
Force-pushed from aaa041e to 8960860
compressed-tensors code reuse
/ready
@kylesayrs you're missing Line 20 in 5923532
Current state looks good so far. The biggest piece of feedback is that we are still rewriting the logic associated with parsing the It will be tricky to fix this (because the vLLM state_dict is not a 1:1 map with the transformers state_dict), so feel free to reach out if you need any pointers.
@robertgshaw2-neuralmagic I think updating the
These test failures seem unrelated to this PR? A few seem to be CUDA errors, and one is complaining about bad LLM metrics measurements.
Sounds good. @kylesayrs I'm just running this by Simon, but we should be good to go.
Force-pushed from 049dc9c to ce29b08
This reverts commit 373538f.
Signed-off-by: Alvant <alvasian@yandex.ru>
Signed-off-by: LeiWang1999 <leiwang1999@outlook.com>
The reused classes are CompressionFormat, QuantizationArgs, QuantizationStrategy, and QuantizationType.
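To illustrate the kind of reuse this PR enables, the sketch below shows how a raw HF-style quantization config dict could be parsed into typed arguments. The class definitions here are illustrative stand-ins, not the actual `compressed-tensors` code; the field names (`num_bits`, `type`, `strategy`) and enum values are assumptions for the sake of the example.

```python
from dataclasses import dataclass
from enum import Enum


# Illustrative stand-ins for the reused compressed-tensors classes;
# the real definitions live in the compressed-tensors package.
class QuantizationType(str, Enum):
    INT = "int"
    FLOAT = "float"


class QuantizationStrategy(str, Enum):
    TENSOR = "tensor"
    CHANNEL = "channel"
    GROUP = "group"


@dataclass
class QuantizationArgs:
    num_bits: int
    type: QuantizationType
    strategy: QuantizationStrategy


def parse_quant_args(cfg: dict) -> QuantizationArgs:
    # Parse a raw config dict into typed, validated arguments;
    # invalid enum values raise ValueError at parse time.
    return QuantizationArgs(
        num_bits=cfg["num_bits"],
        type=QuantizationType(cfg["type"]),
        strategy=QuantizationStrategy(cfg["strategy"]),
    )


args = parse_quant_args({"num_bits": 8, "type": "int", "strategy": "tensor"})
print(args.num_bits, args.type.value, args.strategy.value)  # → 8 int tensor
```

The point of sharing these types is that both the serialization side (in transformers/compressed-tensors) and the inference side (in vLLM) validate configs against a single definition instead of two hand-rolled parsers.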