tests: skip sliding window + fp8 to prevent hang in fmha_v2 unit tests #2781

bkryu merged 3 commits into flashinfer-ai:main
Conversation
Summary of Changes: This pull request addresses an issue where fmha_v2 unit tests hang when sliding-window masking is combined with fp8; the affected cases are now skipped.
No actionable comments were generated in the recent review.
Walkthrough: Adds module- and runtime-level test skip conditions to tests/attention/test_fmha_v2_prefill.py.
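The two skip mechanisms mentioned in the walkthrough differ in when they fire: a module-level skip is evaluated once at collection time and skips the whole file, while a runtime skip inside the test body skips only the offending parametrization. A minimal illustrative sketch (not the repository's actual test file; the environment variable and test name are hypothetical):

```python
import os
import pytest

# Module-level skip: evaluated once when pytest collects this file; if the
# condition holds, every test in the module is reported as skipped.
if os.environ.get("SKIP_FMHA_TESTS") == "1":  # hypothetical condition
    pytest.skip("fmha tests disabled in this environment", allow_module_level=True)

@pytest.mark.parametrize("mask_mode", ["CAUSAL", "SLIDING_WINDOW"])
def test_prefill(mask_mode):
    # Runtime skip: evaluated per parametrization inside the test body,
    # so only the problematic combination is skipped.
    if mask_mode == "SLIDING_WINDOW":
        pytest.skip("temporarily skip sliding window test due to hang")
    assert mask_mode == "CAUSAL"
```

Note that `allow_module_level=True` is required for `pytest.skip` outside a test function; without it, pytest raises a usage error at collection time.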
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~5 minutes
🚥 Pre-merge checks: ✅ 1 passed | ❌ 2 failed (2 warnings)
Code Review
This pull request temporarily disables sliding window tests in fmha_v2 unit tests to prevent hangs. The changes involve adding pytest.skip for cases where mask_mode == "SLIDING_WINDOW". My review focuses on the implementation of these skips. I've identified a case of code redundancy where a new, general skip condition makes a previous, more specific one obsolete, and I've suggested a way to resolve this to improve code maintainability.
```python
if mask_mode == "SLIDING_WINDOW":
    pytest.skip("todo(jimmyzho): temporarily skip sliding window test due to hang")
```
This new if statement makes the preceding check for mask_mode == "SLIDING_WINDOW" on lines 831-839 redundant, creating dead code. To avoid this, you can modify the condition to exclude the case that is already handled by the previous if block. This keeps both pytest.skip calls active for their respective conditions and makes the code easier to maintain.
```python
if mask_mode == "SLIDING_WINDOW" and not (
    batch_size == 16
    and num_kv_heads == 4
    and head_dim == 256
    and dtype == torch.float8_e4m3fn
    and input_layout in ["PACKED_QKV", "CONTIGUOUS_Q_KV"]
):
    pytest.skip("todo(jimmyzho): temporarily skip sliding window test due to hang")
```
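The effect of the suggested exclusion can be checked in isolation. Below is an illustrative sketch of the two skip predicates as plain functions (the function names are hypothetical, and the dtype is represented as a string so the sketch has no torch dependency):

```python
def fp8_sliding_window_bug(batch_size, num_kv_heads, head_dim, dtype, input_layout, mask_mode):
    # Mirrors the earlier, fp8-specific skip condition.
    return (
        batch_size == 16
        and num_kv_heads == 4
        and head_dim == 256
        and dtype == "float8_e4m3fn"
        and input_layout in ["PACKED_QKV", "CONTIGUOUS_Q_KV"]
        and mask_mode == "SLIDING_WINDOW"
    )

def sliding_window_hang(batch_size, num_kv_heads, head_dim, dtype, input_layout, mask_mode):
    # Mirrors the suggested general skip: every sliding-window case EXCEPT
    # those already covered by the fp8-specific condition above.
    return mask_mode == "SLIDING_WINDOW" and not fp8_sliding_window_bug(
        batch_size, num_kv_heads, head_dim, dtype, input_layout, mask_mode
    )

# The fp8 case reaches its own skip; every other sliding-window case hits the general one.
fp8_case = (16, 4, 256, "float8_e4m3fn", "PACKED_QKV", "SLIDING_WINDOW")
other_case = (8, 2, 128, "bfloat16", "PACKED_QKV", "SLIDING_WINDOW")
print(fp8_sliding_window_bug(*fp8_case), sliding_window_hang(*fp8_case))      # True False
print(fp8_sliding_window_bug(*other_case), sliding_window_hang(*other_case))  # False True
```

With the exclusion in place, exactly one of the two predicates is true for any sliding-window case, so neither skip is dead code.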
/bot run
🧹 Nitpick comments (1)
tests/attention/test_fmha_v2_prefill.py (1)
831-841: Redundant skip condition: the fp8-specific check is now dead code. The new unconditional skip at lines 840-841 for all SLIDING_WINDOW cases makes the fp8-specific skip block (lines 831-839) unreachable, since mask_mode == "SLIDING_WINDOW" will always trigger the new skip first. Consider removing the dead code block, or consolidating if you intend to restore the fp8-specific skip once the hang is fixed.
Also, the skip message mentions "todo" but lacks an issue tracker reference. Consider adding a GitHub issue link for tracking when the hang will be addressed.
Proposed consolidation

```diff
-    # skip bs=16, q_heads=4, kv_heads=4, head_dim=256, dtype=float8_e4m3fn if packed/contiguous and sliding window due to bug
-    if (
-        batch_size == 16
-        and num_kv_heads == 4
-        and head_dim == 256
-        and dtype == torch.float8_e4m3fn
-        and input_layout in ["PACKED_QKV", "CONTIGUOUS_Q_KV"]
-        and mask_mode == "SLIDING_WINDOW"
-    ):
-        pytest.skip("Skip due to bug in fp8 sliding window")
     if mask_mode == "SLIDING_WINDOW":
-        pytest.skip("todo(jimmyzho): temporarily skip sliding window test due to hang")
+        pytest.skip("Sliding window causes hang (see issue #XXXX)")
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/attention/test_fmha_v2_prefill.py` around lines 831 - 841, The first fp8-specific skip block (checking batch_size, num_kv_heads, head_dim, dtype == torch.float8_e4m3fn, input_layout, mask_mode == "SLIDING_WINDOW") is dead because the subsequent unconditional if mask_mode == "SLIDING_WINDOW": pytest.skip(...) always triggers; remove the redundant fp8-specific condition or consolidate both into a single skip that preserves the fp8-specific message and references (e.g., include the torch.float8_e4m3fn check in the consolidated skip text if you plan to restore it later), and update the skip message to include a GitHub issue link for tracking instead of “todo(jimmyzho)”.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 7bced305-87d2-4336-ae6f-b9b5bca9c106
📒 Files selected for processing (1)
tests/attention/test_fmha_v2_prefill.py
/bot run

/bot stop

The GitLab CI pipeline #46071495 has been cancelled.

/bot run

[FAILED] Pipeline #46083975: 8/20 passed
flashinfer-ai#2781

Summary by CodeRabbit
- Tests
  - Added module-level and runtime skips to several attention prefill tests to avoid hangs or flakiness for sliding-window mask scenarios and specific FP8 dtype combinations.

Co-authored-by: Brian Ryu <bryu@nvidia.com>
Signed-off-by: Amey Naik <212485788+ameynaik-hub@users.noreply.github.com>
📌 Description

🔍 Related Issues

🚀 Pull Request Checklist

Thank you for contributing to FlashInfer! Before we review your pull request, please make sure the following items are complete.

✅ Pre-commit Checks

- [ ] I have installed `pre-commit` by running `pip install pre-commit` (or used your preferred method).
- [ ] I have installed the hooks with `pre-commit install`.
- [ ] I have run the hooks manually with `pre-commit run --all-files` and fixed any reported issues.

🧪 Tests

- [ ] Tests have been added or updated as needed.
- [ ] All tests are passing (`unittest`, etc.).

Reviewer Notes

Summary by CodeRabbit
- Tests: Added module-level and runtime skips to several attention prefill tests to avoid hangs or flakiness for sliding-window mask scenarios and specific FP8 dtype combinations.