adapt npu FA. (#8171)
wuhuachaocoding authored Mar 25, 2024
1 parent a5d87f5 · commit a0457d1
Showing 1 changed file with 1 addition and 0 deletions.

`paddlenlp/transformers/llama/modeling.py`
```diff
@@ -239,6 +239,7 @@ def scaled_dot_product_attention(
         attention_mask is None,
         True,
         False,
+        False,
     )[0]
 else:
     attn_output = F.scaled_dot_product_attention(
```
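The changed code sits in a branch that dispatches between a fused NPU flash-attention kernel (called with positional boolean flags, where the commit appends one more `False`) and Paddle's generic `F.scaled_dot_product_attention`. A minimal, self-contained sketch of that dispatch pattern follows; the function names `fused_npu_flash_attention` and `generic_sdpa` are hypothetical stand-ins, since the diff does not show the real kernel's name or what each flag means:

```python
def fused_npu_flash_attention(q, k, v, causal, is_training, flag_a, flag_b):
    # Hypothetical stand-in for the fused NPU kernel. Like the real op, it
    # takes positional boolean flags and returns a tuple, so the call site
    # indexes [0] to get the attention output. The commit above adapts the
    # call site by appending one more trailing False to match a new signature.
    return ("fused", causal), None


def generic_sdpa(q, k, v):
    # Stand-in for the fallback path, F.scaled_dot_product_attention.
    return ("generic",)


def scaled_dot_product_attention(q, k, v, attention_mask=None, use_npu_fa=True):
    if use_npu_fa:
        attn_output = fused_npu_flash_attention(
            q, k, v,
            attention_mask is None,  # treat "no explicit mask" as the causal case
            True,
            False,
            False,  # extra flag mirroring the line added in this commit
        )[0]
    else:
        attn_output = generic_sdpa(q, k, v)
    return attn_output
```

The design point is that fused kernels with long positional flag lists are brittle at the call site: adding a parameter to the kernel forces every caller to be "adapted" in lockstep, which is exactly what this one-line commit does.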
