This repository was archived by the owner on Jul 7, 2023. It is now read-only.

Commit 139b676

rllin-fathom authored and afrozenator committed
Update universal_transformer.py (#1192)
1 parent 288f46c commit 139b676

File tree

1 file changed: +1 −1 lines changed


tensor2tensor/models/research/universal_transformer.py

Lines changed: 1 addition & 1 deletion
@@ -416,7 +416,7 @@ def update_hparams_for_universal_transformer(hparams):
   # LSTM forget bias for lstm style recurrence.
   hparams.add_hparam("lstm_forget_bias", 1.0)
   # Uses the memory at the last step as the final output, if true.
-  hparams.add_hparam("use_memory_as_final_state", True)
+  hparams.add_hparam("use_memory_as_final_state", False)
   # if also add a ffn unit to the transition function when using gru/lstm
   hparams.add_hparam("add_ffn_unit_to_the_transition_function", False)
 
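The change flips the default of the use_memory_as_final_state hparam from True to False. Callers who relied on the old behavior can still set the flag back explicitly after the hparams set is built. Below is a minimal sketch, assuming the usual tensor2tensor hparams flow; starting from transformer.transformer_base() is an assumption for illustration and not part of this commit.

from tensor2tensor.models import transformer
from tensor2tensor.models.research import universal_transformer

# Start from a base hparams set and apply the Universal Transformer defaults;
# after this commit, use_memory_as_final_state defaults to False.
hparams = universal_transformer.update_hparams_for_universal_transformer(
    transformer.transformer_base())

# Restore the previous default explicitly if the old behavior is needed.
hparams.use_memory_as_final_state = True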

0 commit comments
