Commit d27df4d

Fix t5 forward pass (#1082)

This was noticed on #900, but we should probably get the fix into the forward pass without waiting on checkpoints.

1 parent d241dd0

File tree: 1 file changed (+1, -1)

keras_nlp/models/t5/t5_transformer_layer.py

Lines changed: 1 addition & 1 deletion

@@ -123,7 +123,7 @@ def call(
         x = self.layer_norm(x)
         if self.use_gated_activation:
             hidden_activation = self.input_projector(x)
-            hidden_linear = self.gate_projector(hidden_states)
+            hidden_linear = self.gate_projector(x)
             x = hidden_activation * hidden_linear
         else:
             x = self.input_projector(x)
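The bug fixed here: in the gated-activation branch, the gate projection was fed the pre-normalization `hidden_states` instead of the layer-normalized `x`, so the two halves of the gate saw different inputs. A minimal standalone sketch of a T5-style gated feed-forward block (a hypothetical NumPy version, not the actual KerasNLP layer, which uses Dense projectors) shows why both projections must consume the same normalized input:

```python
import numpy as np

def gated_feed_forward(x, w_in, w_gate, activation=np.tanh):
    """Sketch of a gated feed-forward block under assumed shapes.

    Both projections must read the SAME layer-normalized input `x`;
    the bug passed the un-normalized hidden states to the gate path.
    `w_in` / `w_gate` stand in for input_projector / gate_projector.
    """
    hidden_activation = activation(x @ w_in)  # nonlinear half of the gate
    hidden_linear = x @ w_gate                # linear half of the gate
    return hidden_activation * hidden_linear  # elementwise gating

# Usage: with identity-like weights the output is easy to check by hand.
x = np.ones((2, 4))
out = gated_feed_forward(x, np.eye(4), 2.0 * np.eye(4))
```

With `w_in = I` and `w_gate = 2I`, each output entry is `tanh(1.0) * 2.0`, confirming the two paths multiply elementwise over the same input.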
