This repository was archived by the owner on Jul 7, 2023. It is now read-only.

Commit eed1ccf

twilightdema authored and afrozenator committed
Fix transformer_moe model has wrong logic in pre/postprocessing (#1233)
1 parent 2167370 commit eed1ccf

File tree

1 file changed: +2 additions, −2 deletions

tensor2tensor/models/research/transformer_moe.py

Lines changed: 2 additions & 2 deletions
@@ -93,8 +93,8 @@ def prepostprocess(fct):
       """Apply processing and capture the extra loss."""
       @expert_utils.add_var_scope()
       def decorated(x, *args, **kwargs):
-        x = dp_preprocess(x)
-        y, loss = fct(x, *args, **kwargs)
+        x_preprocessed = dp_preprocess(x)
+        y, loss = fct(x_preprocessed, *args, **kwargs)
         cache["extra_loss"] += loss
         return dp_postprocess(x, y)
       return decorated
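
The fix is easy to miss in the two-line diff: before the change, x = dp_preprocess(x) overwrote the original input, so the final dp_postprocess(x, y) received the already-preprocessed tensor instead of the raw one. Renaming the intermediate to x_preprocessed keeps the untouched input available for post-processing. Below is a minimal sketch of why this matters, assuming the post-processing step combines the original input with the sub-layer output (for example, a residual connection); the helper names preprocess, postprocess, and layer_fn are illustrative stand-ins, not the tensor2tensor API.

import numpy as np

def preprocess(x):
  # Stand-in for dp_preprocess: e.g. a normalization applied before the layer.
  return (x - x.mean()) / (x.std() + 1e-6)

def postprocess(x, y):
  # Stand-in for dp_postprocess: assumed here to add a residual that needs
  # the ORIGINAL input x.
  return x + y

def layer_fn(x):
  # Stand-in for the wrapped layer fct.
  return 2.0 * x

def decorated_buggy(x):
  x = preprocess(x)                 # the original x is overwritten here
  y = layer_fn(x)
  return postprocess(x, y)          # residual is built on the normalized x

def decorated_fixed(x):
  x_preprocessed = preprocess(x)    # keep the original x intact
  y = layer_fn(x_preprocessed)
  return postprocess(x, y)          # residual uses the original x

x = np.array([1.0, 2.0, 3.0])
print(decorated_buggy(x))  # differs from the fixed version
print(decorated_fixed(x))

In the buggy version both the layer and the skip path see only the normalized tensor, so under this assumption the original activations are never carried forward; the one-variable rename restores that behavior.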

0 commit comments
