Closed as not planned
Labels: bug (Something isn't working), stale (Over 90 days of inactivity)
Description
Your current environment
Via @hackey:
I am using:
ROCM (Dual AMD 7900 xtx)
Ubuntu 24.04
🐛 Describe the bug
See #6479 (comment)
Specifically this part:
```
ERROR 02-21 11:17:10 registry.py:321]     from vllm.attention.backends.flash_attn import FlashAttentionMetadata
ERROR 02-21 11:17:10 registry.py:321]   File "/usr/local/lib/python3.12/dist-packages/vllm/attention/backends/flash_attn.py", line 25, in <module>
ERROR 02-21 11:17:10 registry.py:321]     from vllm.vllm_flash_attn import (flash_attn_varlen_func,
ERROR 02-21 11:17:10 registry.py:321] ImportError: cannot import name 'flash_attn_varlen_func' from 'vllm.vllm_flash_attn' (unknown location)
ERROR 02-21 11:17:10 registry.py:321]
Traceback (most recent call last):
  File "/usr/local/bin/vllm", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/vllm/entrypoints/cli/main.py", line 73, in main
```
It looks like the problem is caused by MambaMixer2 importing FlashAttentionMetadata at module level, which in turn pulls in vllm_flash_attn, a package that is not supported on ROCm.
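For illustration, a minimal sketch of the lazy-import pattern that avoids this class of failure: instead of importing the backend at module level (where it breaks every import chain that touches the module, even on ROCm systems that never use flash attention), resolve the symbol only when it is actually requested. The helper name `load_backend_symbol` is hypothetical, not vLLM's actual API.

```python
import importlib


def load_backend_symbol(module_name: str, symbol: str):
    """Hypothetical helper: import an optional backend module only at
    call time, so unsupported platforms (e.g. ROCm without
    vllm_flash_attn) fail with a clear error only if the backend is
    actually selected, not on every import of the caller's module."""
    try:
        mod = importlib.import_module(module_name)
    except ImportError as exc:
        raise RuntimeError(
            f"backend {module_name!r} is unavailable on this platform; "
            "select a supported attention backend instead"
        ) from exc
    return getattr(mod, symbol)


# A caller would then do, inside the function that needs it:
#   flash_attn_varlen_func = load_backend_symbol(
#       "vllm.vllm_flash_attn", "flash_attn_varlen_func")
```

With this shape, MambaMixer2 could import type-only names under `typing.TYPE_CHECKING` and defer the runtime import, so the ROCm path never evaluates the CUDA-only package.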
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.