[Bug] [ROCm] Fix AiterFAMetadata Import Error when AITER is not installed #23106
tjtanaa wants to merge 1 commit into vllm-project:main
Conversation
Signed-off-by: tjtanaa <tunjian.tan@embeddedllm.com>
Code Review
This pull request fixes an ImportError in the ROCm AITER FlashAttention backend when the aiter package is not installed. The change correctly checks for the aiter package before attempting to import it. However, the fix is incomplete and can still lead to a runtime error if the backend is selected without aiter installed. My review includes a comment explaining the issue and suggesting a more robust fix: make all aiter-dependent class definitions conditional, a common pattern in this codebase for handling optional dependencies.
```diff
 _PARTITION_SIZE_ROCM = 256

-if current_platform.is_rocm():
+if current_platform.is_rocm() and find_spec("aiter"):
```
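For context, `importlib.util.find_spec` only locates a package without executing it, which is why this guard avoids the ImportError at module load time. A minimal, self-contained sketch of the pattern (the `rocm_aiter_available` helper is hypothetical, standing in for the combined platform-and-package check in the diff):

```python
from importlib.util import find_spec

# find_spec returns a ModuleSpec if the package can be located, else None.
# Unlike `import aiter`, it does not run the package's code, so a missing
# optional dependency cannot raise ImportError when this module is loaded.
HAS_AITER = find_spec("aiter") is not None
HAS_JSON = find_spec("json") is not None  # stdlib module, always present


def rocm_aiter_available(is_rocm: bool) -> bool:
    """Hypothetical helper mirroring the guard in the diff:
    the platform must be ROCm AND the aiter package must be installed."""
    return is_rocm and HAS_AITER
```

Note that `find_spec` only proves the package is locatable; a subsequent `import aiter` could still fail if the installation is broken, but that is a different failure mode from the one this PR addresses.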
While this change correctly prevents an ImportError at module load time if aiter is not installed, it can lead to a RuntimeError later.
The classes AiterFlashAttentionMetadata, AiterFlashAttentionMetadataBuilder, AiterFlashAttentionBackend, and AiterFlashAttentionImpl are defined later in this file (from line 207 onwards) and they depend on the aiter package and the custom ops registered within this if block.
If aiter is not installed, this if block is skipped, but the classes are still defined. When AiterFlashAttentionBackend is then used, it will try to call ops that were never registered, leading to a crash at runtime.
To fix this properly, all the definitions that depend on aiter should be moved inside this if block. This includes lines 204 through 547. This will ensure that the AiterFlashAttentionBackend is only available when aiter is installed, preventing runtime errors.
A similar pattern is used for other attention backends with optional dependencies, like flash_attn.
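The suggested structure can be sketched as follows. This is a simplified illustration, not the actual file contents: `_platform_is_rocm` is a hypothetical stand-in for `current_platform.is_rocm()`, and the real backend classes carry full implementations.

```python
from importlib.util import find_spec


def _platform_is_rocm() -> bool:
    # Stand-in for current_platform.is_rocm(); hardcoded for illustration.
    return False


if _platform_is_rocm() and find_spec("aiter"):
    import aiter  # safe: find_spec confirmed the package is locatable

    # ... register aiter custom ops here ...

    class AiterFlashAttentionBackend:
        """Defined only when aiter is importable, so selecting this
        backend can never reach unregistered ops."""
else:
    # Expose a sentinel so callers get a clear failure (e.g. an explicit
    # "backend unavailable" check) instead of a NameError.
    AiterFlashAttentionBackend = None
```

With this shape, a missing aiter installation surfaces at backend selection time as "backend unavailable" rather than as a crash deep inside an op call.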
👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default; only a limited subset of checks runs automatically. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging. To run CI, PR reviewers can either: Add 🚀
Purpose
A simple fix for the issue raised in #22795.
Test Plan
Test Result
(Optional) Documentation Update
Essential Elements of an Effective PR Description Checklist
Update supported_models.md and examples for a new model.