Commit 1a3633a

[conformer] sdpa default to false (#2362)
1 parent: 5dfc9dc

File tree

1 file changed: +1 −1 lines changed


wenet/transformer/attention.py (+1 −1)

@@ -230,7 +230,7 @@ def __init__(self,
                  n_feat: int,
                  dropout_rate: float,
                  key_bias: bool = True,
-                 use_sdpa: bool = True):
+                 use_sdpa: bool = False):
         """Construct an RelPositionMultiHeadedAttention object."""
         super().__init__(n_head, n_feat, dropout_rate, key_bias, use_sdpa)
         # linear transformation for positional encoding
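The one-line change only flips which attention path is selected by default; with or without SDPA, the math is the same scaled dot-product attention, so existing callers should see identical outputs (the SDPA path just uses a fused kernel). A minimal NumPy sketch of that equivalence, with hypothetical function names standing in for wenet's real dispatch and for `torch.nn.functional.scaled_dot_product_attention` (no masking or dropout shown):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sdpa(q, k, v):
    # Stand-in for the fused SDPA kernel used when use_sdpa=True.
    scale = 1.0 / np.sqrt(q.shape[-1])
    return softmax(q @ k.swapaxes(-1, -2) * scale) @ v

def attention(q, k, v, use_sdpa=False):
    # Hypothetical sketch of the dispatch, not wenet's actual code:
    # both branches compute the same attention, so changing the
    # default changes the kernel, not the result.
    if use_sdpa:
        return sdpa(q, k, v)
    scale = 1.0 / np.sqrt(q.shape[-1])
    scores = q @ k.swapaxes(-1, -2) * scale  # (batch, head, time, time)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(2, 4, 8, 16)) for _ in range(3))
manual = attention(q, k, v, use_sdpa=False)
fused = attention(q, k, v, use_sdpa=True)
```

After this commit, callers that relied on the old `True` default must pass `use_sdpa=True` explicitly to keep using the SDPA path.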

0 commit comments
