[BugFix] bugfix for Flash Attention MLA with full cuda graph IMA following pr-25490 #27128

Merged
LucasWilkinson merged 5 commits into vllm-project:main from Daisy-Ma-coder:flash_attn_mla_ima_fix
Oct 22, 2025
Commits

Commits on Oct 17, 2025

Commits on Oct 18, 2025

Commits on Oct 22, 2025