
tests: skip sliding window + fp8 to prevent hang in fmha_v2 unit tests#2781

Merged
bkryu merged 3 commits into flashinfer-ai:main from jimmyzho:fix-hang
Mar 14, 2026

Conversation

@jimmyzho
Contributor

@jimmyzho jimmyzho commented Mar 13, 2026

📌 Description

🔍 Related Issues

🚀 Pull Request Checklist

Thank you for contributing to FlashInfer! Before we review your pull request, please make sure the following items are complete.

✅ Pre-commit Checks

  • I have installed pre-commit by running pip install pre-commit (or used your preferred method).
  • I have installed the hooks with pre-commit install.
  • I have run the hooks manually with pre-commit run --all-files and fixed any reported issues.

If you are unsure about how to set up pre-commit, see the pre-commit documentation.

🧪 Tests

  • Tests have been added or updated as needed.
  • All tests are passing (unittest, etc.).

Reviewer Notes

Summary by CodeRabbit

  • Tests
    • Added module-level and runtime skips to several attention prefill tests to avoid hangs or flakiness for sliding-window mask scenarios and specific FP8 dtype combinations.

@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses an issue where fmha_v2 unit tests using the SLIDING_WINDOW mask mode were hanging. To mitigate this, temporary skips have been introduced for these specific test cases, ensuring the test suite can complete without interruption while the underlying hang issue is investigated.

Highlights

  • Test Skipping: Implemented conditional pytest.skip for SLIDING_WINDOW mask mode in fmha_v2 prefill unit tests to prevent reported hangs.


Changelog
  • tests/attention/test_fmha_v2_prefill.py
    • Added a conditional skip for SLIDING_WINDOW mask mode in test_trtllm_fmha_v2_prefill.
    • Added a conditional skip for SLIDING_WINDOW mask mode in test_trtllm_fmha_v2_prefill_attention_sinks.

@coderabbitai
Contributor

coderabbitai bot commented Mar 13, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 023ce645-a503-4d10-bcda-bb96ac8614dd

📥 Commits

Reviewing files that changed from the base of the PR and between 6888c0b and 38a1003.

📒 Files selected for processing (1)
  • tests/attention/test_fmha_v2_prefill.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • tests/attention/test_fmha_v2_prefill.py

📝 Walkthrough

Walkthrough

Adds module- and runtime-level test skip conditions to tests/attention/test_fmha_v2_prefill.py to bypass SLIDING_WINDOW mask mode and specific FP8 runtime scenarios that cause hangs.

Changes

Cohort / File(s): Test Skip Conditions, tests/attention/test_fmha_v2_prefill.py
Summary: Inserted a module-global skip marker; added runtime skips in test_trtllm_fmha_v2_prefill for mask_mode == "SLIDING_WINDOW" and for FP8 combinations (e.g., torch.float8_e4m3fn), and added a runtime skip in test_trtllm_fmha_v2_prefill_attention_sinks for mask_mode == "SLIDING_WINDOW".
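The runtime-skip pattern described in the walkthrough can be sketched as follows. This is a simplified, hypothetical helper (the function name skip_reason and the plain-string mask_mode parameter are illustrative), not the actual test code, which compares torch dtypes and calls pytest.skip inline:

```python
from typing import Optional

# Sketch of the runtime skip logic described above. The helper name and
# string-based parameter are illustrative; the real tests check torch
# dtypes (e.g. torch.float8_e4m3fn) and call pytest.skip directly.

def skip_reason(mask_mode: str) -> Optional[str]:
    """Return a reason to skip this test case, or None to run it."""
    if mask_mode == "SLIDING_WINDOW":
        # All sliding-window cases are skipped for now; this unconditional
        # check subsumes the earlier, narrower fp8 + sliding-window skip
        # that reviewers flagged as dead code.
        return "temporarily skip sliding window test due to hang"
    return None

# Inside a pytest test body this pattern looks like:
#     reason = skip_reason(mask_mode)
#     if reason is not None:
#         pytest.skip(reason)
```

Keeping the reason in one place makes it easy to replace the skip with a real fix (or an issue link) later.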

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~5 minutes

Suggested reviewers

  • Anerudhan
  • yzh119
  • nvmbreughe

Poem

🐰 I skip a test with nimble feet,
When sliding masks and FP8 meet,
A little hop, the hang's avoided,
Green CI skies once more — joy unnoided,
Carrot cheers for tests complete!

🚥 Pre-merge checks | ✅ 1 | ❌ 2

❌ Failed checks (2 warnings)

  • Description check (⚠️ Warning): The description contains only the unmodified PR template with no actual implementation details filled in; all sections are empty or contain only checklist items. Resolution: fill in the Description section explaining what changes were made and why (preventing hangs), and optionally reference related issues or reviewer notes.
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 50.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (1 passed)

  • Title check (✅ Passed): The title accurately describes the main change: adding skip markers for sliding window and fp8 cases in fmha_v2 tests to prevent hangs.


Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request temporarily disables sliding window tests in fmha_v2 unit tests to prevent hangs. The changes involve adding pytest.skip for cases where mask_mode == "SLIDING_WINDOW". My review focuses on the implementation of these skips. I've identified a case of code redundancy where a new, general skip condition makes a previous, more specific one obsolete, and I've suggested a way to resolve this to improve code maintainability.

Comment on lines +840 to +841
if mask_mode == "SLIDING_WINDOW":
pytest.skip("todo(jimmyzho): temporarily skip sliding window test due to hang")

Severity: medium

This new if statement makes the preceding check for mask_mode == "SLIDING_WINDOW" on lines 831-839 redundant, creating dead code. To avoid this, you can modify the condition to exclude the case that is already handled by the previous if block. This keeps both pytest.skip calls active for their respective conditions and makes the code easier to maintain.

    if mask_mode == "SLIDING_WINDOW" and not (
        batch_size == 16
        and num_kv_heads == 4
        and head_dim == 256
        and dtype == torch.float8_e4m3fn
        and input_layout in ["PACKED_QKV", "CONTIGUOUS_Q_KV"]
    ):
        pytest.skip("todo(jimmyzho): temporarily skip sliding window test due to hang")

@bkryu
Collaborator

bkryu commented Mar 13, 2026

/bot run

Contributor

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
tests/attention/test_fmha_v2_prefill.py (1)

831-841: Redundant skip condition: fp8-specific check is now dead code.

The new unconditional skip at lines 840-841 for all SLIDING_WINDOW cases makes the fp8-specific skip block (lines 831-839) unreachable—since mask_mode == "SLIDING_WINDOW" will always trigger the new skip first.

Consider removing the dead code block or consolidating if you intend to restore the fp8-specific skip once the hang is fixed.

Also, the skip message mentions "todo" but lacks an issue tracker reference. Consider adding a GitHub issue link for tracking when the hang will be addressed.

Proposed consolidation
-    # skip bs=16, q_heads=4, kv_heads=4, head_dim=256, dtype=float8_e4m3fn if packed/contiguous and sliding window due to bug
-    if (
-        batch_size == 16
-        and num_kv_heads == 4
-        and head_dim == 256
-        and dtype == torch.float8_e4m3fn
-        and input_layout in ["PACKED_QKV", "CONTIGUOUS_Q_KV"]
-        and mask_mode == "SLIDING_WINDOW"
-    ):
-        pytest.skip("Skip due to bug in fp8 sliding window")
     if mask_mode == "SLIDING_WINDOW":
-        pytest.skip("todo(jimmyzho): temporarily skip sliding window test due to hang")
+        pytest.skip("Sliding window causes hang (see issue `#XXXX`)")

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 7bced305-87d2-4336-ae6f-b9b5bca9c106

📥 Commits

Reviewing files that changed from the base of the PR and between 74e99e8 and acd2662.

📒 Files selected for processing (1)
  • tests/attention/test_fmha_v2_prefill.py

@bkryu bkryu added the run-ci label Mar 13, 2026
@flashinfer-bot
Collaborator

GitLab MR !416 has been created, and the CI pipeline #46070888 is currently running. I'll report back once the pipeline job completes.

@jimmyzho jimmyzho changed the title tests: skip sliding window to prevent hang in fmha_v2 unit tests tests: skip sliding window + fp8 to prevent hang in fmha_v2 unit tests Mar 13, 2026
@jimmyzho
Contributor Author

/bot run

@flashinfer-bot
Collaborator

GitLab MR !416 has been updated with latest changes, and the CI pipeline #46071495 is currently running. I'll report back once the pipeline job completes.

@bkryu
Collaborator

bkryu commented Mar 13, 2026

/bot stop

@flashinfer-bot
Collaborator

The GitLab CI pipeline #46071495 has been cancelled.

@bkryu
Collaborator

bkryu commented Mar 13, 2026

/bot run

@flashinfer-bot
Collaborator

GitLab MR !416 has been updated with latest changes, and the CI pipeline #46083975 is currently running. I'll report back once the pipeline job completes.

@flashinfer-bot
Collaborator

[FAILED] Pipeline #46083975: 8/20 passed

@bkryu bkryu merged commit 46fd825 into flashinfer-ai:main Mar 14, 2026
30 of 31 checks passed
@coderabbitai coderabbitai bot mentioned this pull request Mar 18, 2026
5 tasks
frankwang28 pushed a commit to frankwang28/flashinfer that referenced this pull request Mar 18, 2026
(flashinfer-ai#2781)

Co-authored-by: Brian Ryu <bryu@nvidia.com>
ameynaik-hub pushed a commit to ameynaik-hub/flashinfer that referenced this pull request Mar 18, 2026
(flashinfer-ai#2781)

Co-authored-by: Brian Ryu <bryu@nvidia.com>
Signed-off-by: Amey Naik <212485788+ameynaik-hub@users.noreply.github.com>


4 participants