
Commit
Polish formats.
limin2021 committed Nov 9, 2021
1 parent d083224 commit 87b1bff
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion paddle/fluid/operators/fused/fused_attention_op.cc
@@ -325,7 +325,7 @@ class FusedAttentionOpMaker : public framework::OpProtoAndCheckerMaker {
   out = layer_norm(input);
   out = compute_qkv(out) + bias;
   // fmha module
-  {
+  {
   out = transpose(out, perm=[2, 0, 3, 1, 4]);
   out = q * k^t;
   out = attn_mask + out;
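The pseudocode in the op's comment block above can be sketched in NumPy. This is a hedged illustration only: the tensor shapes, the `[B, S, 3, H, D]` layout of the QKV tensor, and everything past the truncated `attn_mask + out` line are assumptions, not part of the diff.

```python
import numpy as np

# Assumed dimensions: batch B, sequence length S, heads H, head dim D.
B, S, H, D = 2, 4, 3, 8

# Assumed qkv layout after compute_qkv(out) + bias: [B, S, 3, H, D],
# where axis 2 packs the q, k, v projections together.
qkv = np.random.randn(B, S, 3, H, D).astype(np.float32)
attn_mask = np.zeros((B, 1, S, S), dtype=np.float32)

# out = transpose(out, perm=[2, 0, 3, 1, 4])  ->  [3, B, H, S, D]
out = qkv.transpose(2, 0, 3, 1, 4)
q, k, v = out[0], out[1], out[2]

# out = q * k^t  ->  attention scores of shape [B, H, S, S]
scores = q @ k.transpose(0, 1, 3, 2)

# out = attn_mask + out  (mask broadcasts over the head axis)
scores = attn_mask + scores

assert scores.shape == (B, H, S, S)
```

With these assumed shapes, the `perm=[2, 0, 3, 1, 4]` transpose is what splits the packed QKV tensor into per-head `q`, `k`, `v` slices so the batched matmul can form the `[B, H, S, S]` score matrix.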
