[ROCm][FP8][Kernel] FP8 quantization fused into Custom Paged Attention#17139
Merged
vllm-bot merged 5 commits into vllm-project:main on May 7, 2025
Commits
- Commits on Apr 24, 2025
- Commits on Apr 25, 2025
- Commits on May 1, 2025
- Commits on May 5, 2025