4 changes: 4 additions & 0 deletions tests/test_vllm_flash_attn.py
@@ -2,6 +2,10 @@
# This file is copied verbatim from vLLM:
# https://github.com/vllm-project/vllm/blob/main/tests/kernels/test_flash_attn.py
#
# How do I build or run flash attention for this test? I get the following errors when trying to run test_varlen_with_paged_kv:
# FAILED tests/test_vllm_flash_attn.py::test_varlen_with_paged_kv[True-2-False-32768-None-dtype0-None-16-128-num_heads0-seq_lens0] - AttributeError: '_OpNamespace' '_vllm_fa2_C' object has no attribute 'varlen_fwd'
# FAILED tests/test_vllm_flash_attn.py::test_varlen_with_paged_kv[False-2-False-32768-None-dtype0-None-16-128-num_heads0-seq_lens0] - AttributeError: '_OpNamespace' '_vllm_fa2_C' object has no attribute 'varlen_fwd'
#

import math
from typing import List, Optional, Tuple