[Bugfix] Fix MLA attention crash with AWQ/GPTQ quantized models #34695
Merged
MatthewBonanni merged 3 commits into vllm-project:main on Mar 13, 2026