fix(fmha_v2): enable flash_attention for Q_PAGED_KV regardless of s_kv #3106

Open
blake-snc wants to merge 1 commit into flashinfer-ai:main from blake-snc:fix/fmha-v2-q-paged-kv-flash-attention-gate
Commits

Commits on Apr 17, 2026