This repository was archived by the owner on Jul 7, 2023. It is now read-only.

Commit 21394f1

phamthuonghai authored and lukaszkaiser committed
Should not generate summary during decoding in dot_product_attention_relative (#1618)
1 parent: 2bc2189 · commit: 21394f1

File tree

1 file changed: +1 −1

tensor2tensor/layers/common_attention.py

Lines changed: 1 addition & 1 deletion
@@ -1745,7 +1745,7 @@ def dot_product_attention_relative(q,
       save_weights_to[scope.name] = weights
       save_weights_to[scope.name + "/logits"] = logits
     weights = tf.nn.dropout(weights, 1.0 - dropout_rate)
-    if not tf.get_variable_scope().reuse and make_image_summary:
+    if not tf.get_variable_scope().reuse and common_layers.should_generate_summaries() and make_image_summary:
       attention_image_summary(weights, image_shapes)
     return _relative_attention_inner(weights, v, relations_values, False)
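
For context, the added guard common_layers.should_generate_summaries() skips attention_image_summary in situations where TensorFlow summaries cannot or should not be produced, most notably inside the tf.while_loop that drives autoregressive decoding. The following is a minimal sketch of such a guard under TF 1.x; the "while/" name-scope test and the variable-scope reuse test are assumptions made for illustration, not necessarily the exact tensor2tensor implementation.

import tensorflow as tf


def should_generate_summaries():
  """Return True only in contexts where creating TF summary ops is safe.

  Sketch of the guard consulted before attention_image_summary: summaries
  created inside a tf.while_loop body (e.g. the decoding loop) or inside a
  reused variable scope (e.g. extra data shards) are skipped.
  """
  name_scope = tf.contrib.framework.get_name_scope()
  if name_scope and "while/" in name_scope:
    # Summary ops do not work inside tf.while_loop bodies, which is the
    # situation during autoregressive decoding.
    return False
  if tf.get_variable_scope().reuse:
    # A reused scope usually means a duplicate tower or data shard; avoid
    # emitting the same summary more than once.
    return False
  return True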
