forked from Dao-AILab/flash-attention
Pull requests: vllm-project/flash-attention
Removed the assertion imposed on cu_seqlens_k and seqused_k
#59 opened Mar 29, 2025 by chenyang78
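For context, cu_seqlens_k and seqused_k are arguments of the variable-length attention entry point, flash_attn_varlen_func. Below is a minimal sketch of how they are typically passed; it assumes the upstream Dao-AILab argument names (the vllm fork ships the same function under the vllm_flash_attn package), and it does not reproduce the specific assertion this PR removes.

```python
# Minimal sketch of a variable-length (packed) attention call, assuming
# the flash_attn_varlen_func interface from upstream flash-attention.
import torch
from flash_attn import flash_attn_varlen_func  # in the vllm fork: from vllm_flash_attn import ...

# Two sequences of lengths 3 and 5, packed into one (total_tokens, nheads, headdim) tensor.
seqlens = torch.tensor([3, 5], dtype=torch.int32, device="cuda")
cu_seqlens_q = torch.nn.functional.pad(seqlens.cumsum(0, dtype=torch.int32), (1, 0))
cu_seqlens_k = cu_seqlens_q.clone()   # cumulative key offsets (prefix sums over key lengths)
seqused_k = seqlens.clone()           # number of key tokens actually valid per sequence

total, nheads, headdim = int(seqlens.sum()), 8, 64
q = torch.randn(total, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

out = flash_attn_varlen_func(
    q, k, v,
    cu_seqlens_q=cu_seqlens_q,
    cu_seqlens_k=cu_seqlens_k,
    max_seqlen_q=int(seqlens.max()),
    max_seqlen_k=int(seqlens.max()),
    seqused_k=seqused_k,  # per-sequence valid key counts (e.g. with a paged KV cache)
    causal=True,
)
print(out.shape)  # (total_tokens, nheads, headdim)
```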
Add back flash_attn_func api (and support FA3) [Don't Merge Yet]
#40 opened Jan 26, 2025 by LucasWilkinson
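flash_attn_func is the fixed-length (batch, seqlen, nheads, headdim) attention entry point that this PR restores in the fork. A minimal sketch of a typical call, assuming the upstream flash-attention signature:

```python
# Minimal sketch of a fixed-length attention call, assuming the
# flash_attn_func signature from upstream flash-attention.
import torch
from flash_attn import flash_attn_func  # in the vllm fork: from vllm_flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 128, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Causal self-attention; softmax_scale defaults to 1 / sqrt(headdim) when omitted.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```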