Skip trtllm_alltoall tests on Thor #2448
Add module-level pytest skip for all tests in test_trtllm_alltoall.py when running on SM110 compute capability devices (Thor). These tests have known issues on this architecture. Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Update comment and skip reason to explicitly state that these tests hang indefinitely on SM110 (Thor) devices rather than just having "known issues". Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
📝 Walkthrough

A test skip marker was added to conditionally bypass all tests in the module when a CUDA device with compute capability 11 (SM110/Thor) is detected, using a new import from flashinfer.utils to prevent test hangs on that hardware.
Code Review
This pull request disables the trtllm_alltoall tests on Thor (SM110) devices where they are known to hang. The change is correct and uses pytest.mark.skipif appropriately. I've added one suggestion to improve the robustness of the test skipping logic by also handling cases where CUDA is not available, which would currently lead to test failures instead of skips.
```python
# Skip all tests on SM110 (Thor) devices - these tests hang indefinitely on this architecture
pytestmark = pytest.mark.skipif(
    torch.cuda.is_available()
    and get_compute_capability(torch.device("cuda:0"))[0] == 11,
    reason="Tests hang indefinitely on SM110 (Thor) devices",
)
```
The current implementation correctly skips tests on Thor devices. However, if CUDA is not available, the skipif condition evaluates to false, causing the tests to run and subsequently fail because they require CUDA. It is better practice for tests to be skipped if their environmental requirements are not met.
This suggestion refactors the logic to also skip all tests in this file if CUDA is not available. This provides a clearer result for developers running tests in a non-GPU environment and makes the skipping logic more robust and easier to read.
Suggested change:

```diff
-# Skip all tests on SM110 (Thor) devices - these tests hang indefinitely on this architecture
-pytestmark = pytest.mark.skipif(
-    torch.cuda.is_available()
-    and get_compute_capability(torch.device("cuda:0"))[0] == 11,
-    reason="Tests hang indefinitely on SM110 (Thor) devices",
-)
+# Skip all tests on SM110 (Thor) devices and if CUDA is not available
+_skip_reason = None
+if not torch.cuda.is_available():
+    _skip_reason = "CUDA not available, skipping trtllm_alltoall tests"
+elif get_compute_capability(torch.device("cuda:0"))[0] == 11:
+    _skip_reason = "Tests hang indefinitely on SM110 (Thor) devices"
+pytestmark = pytest.mark.skipif(_skip_reason is not None, reason=_skip_reason)
```
CUDA will always be available when we test FlashInfer
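The discussion above boils down to a single boolean condition. The sketch below (not part of the PR) mirrors that condition with the GPU queries replaced by plain parameters, so it runs without torch or a CUDA device; in the real test file, `cuda_available` comes from `torch.cuda.is_available()` and `compute_capability` from flashinfer's `get_compute_capability(torch.device("cuda:0"))`.

```python
def should_skip(cuda_available: bool, compute_capability: tuple) -> bool:
    """Mirror the module-level pytestmark condition: skip only on SM110."""
    return cuda_available and compute_capability[0] == 11

assert should_skip(True, (11, 0)) is True   # SM110 (Thor): skipped
assert should_skip(True, (9, 0)) is False   # SM90 (Hopper): runs
# With no CUDA the condition is False, so the tests would run and then
# fail -- the gap the review suggestion raised, which the maintainer
# considers moot because CI always has CUDA.
assert should_skip(False, (11, 0)) is False
```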
/bot run |
[FAILED] Pipeline #42938127: 6/20 passed |
@flashinfer-bot run |
@flashinfer-bot rerun failed |
## 📌 Description

After adding Thor to our internal CI, we've been seeing consistent hangs in this test suite. Since this functionality likely isn't crucial for this platform, we can disable the tests to unblock the rest of the testing and get nightly Thor test results.

## 🔍 Related Issues

## 🚀 Pull Request Checklist

Thank you for contributing to FlashInfer! Before we review your pull request, please make sure the following items are complete.

### ✅ Pre-commit Checks

- [x] I have installed `pre-commit` by running `pip install pre-commit` (or used your preferred method).
- [x] I have installed the hooks with `pre-commit install`.
- [x] I have run the hooks manually with `pre-commit run --all-files` and fixed any reported issues.

> If you are unsure about how to set up `pre-commit`, see [the pre-commit documentation](https://pre-commit.com/).

## 🧪 Tests

- [ ] Tests have been added or updated as needed.
- [ ] All tests are passing (`unittest`, etc.).

## Reviewer Notes

## Summary by CodeRabbit

* **Tests**
  * Updated tests to skip execution on SM110/Thor CUDA devices to prevent hangs during test runs.

---------

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>