
[Bugfix][V1][ROCm] Fix AITER Flash Attention Backend (Fix API Break and Local Attention Logic: affecting Llama4) #19904

Merged
houseroad merged 4 commits into vllm-project:main from EmbeddedLLM:fix-llama4-aiterfa on Jun 26, 2025

Commits

- Commits on Jun 20, 2025
- Commits on Jun 22, 2025
- Commits on Jun 26, 2025