Change 2cta opt in to have min seqlen > 2*m_block_size (#2320)

Merged: drisspg merged 1 commit into main from drisspg/stack/27 on Mar 9, 2026
Conversation

@drisspg (Collaborator) commented Mar 9, 2026

Stacked PRs:


Change 2cta opt in to have min seqlen > 2*m_block_size
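The condition in the PR title can be sketched as a small predicate. This is a minimal illustration, not the actual code in flash_attn/cute/interface.py; the names `should_use_2cta`, `min_seqlen`, and `m_block_size` are assumed for illustration.

```python
def should_use_2cta(min_seqlen: int, m_block_size: int) -> bool:
    """Hypothetical sketch of the opt-in heuristic this PR describes.

    Opt in to the 2-CTA path only when the shortest sequence in the
    batch spans more than two M-blocks; otherwise the second CTA would
    have little or no work, so the 1-CTA path is preferred.
    """
    return min_seqlen > 2 * m_block_size
```

For example, with an assumed `m_block_size` of 128, a minimum sequence length of 256 would stay on the 1-CTA path (the strict inequality fails), while 257 would opt in to 2 CTAs.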

@drisspg force-pushed the drisspg/stack/27 branch from ce5b2c5 to 50cb152 on March 9, 2026 at 18:26
drisspg added a commit to drisspg/flash-attention that referenced this pull request Mar 9, 2026
Comment thread on flash_attn/cute/interface.py (outdated)
@drisspg requested a review from jayhshah on March 9, 2026 at 18:29
stack-info: PR: #2320, branch: drisspg/stack/27
@drisspg marked this pull request as draft on March 9, 2026 at 19:17
@drisspg force-pushed the drisspg/stack/27 branch from 50cb152 to 30ce8d2 on March 9, 2026 at 19:17
@drisspg marked this pull request as ready for review on March 9, 2026 at 19:17
@drisspg merged commit 42c5765 into main on Mar 9, 2026
5t4r1i9ht pushed a commit to 5t4r1i9ht/flash-attention that referenced this pull request Mar 15, 2026
NJX-njx pushed a commit to NJX-njx/flash-attention that referenced this pull request Mar 28, 2026
@drisspg deleted the drisspg/stack/27 branch on March 31, 2026 at 02:30


3 participants