This repository has been archived by the owner on Jan 15, 2024. It is now read-only.

fix in_channels of LN (#1177)
zheyuye authored Feb 25, 2020
1 parent a785e6f commit bb2db3a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion scripts/language_model/model/qa.py
@@ -43,7 +43,7 @@ def __init__(self, units=768, is_eval=False, prefix=None, params=None):
         with self.name_scope():
             self.dense_0 = nn.Dense(units, activation='tanh', flatten=False)
             self.dense_1 = nn.Dense(1, flatten=False)
-            self.layernorm = nn.LayerNorm(epsilon=1e-12, in_channels=768)
+            self.layernorm = nn.LayerNorm(epsilon=1e-12, in_channels=units)

     def __call__(self,
                  hidden_states,
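
The change ties the LayerNorm width to the block's `units` argument instead of a hardcoded 768, so the block also works for hidden sizes other than the BERT/XLNet-base default. Below is a minimal sketch of the effect, assuming MXNet Gluon; the `PoolerBlock` class name, forward pass, and shapes are simplified illustrations and not the actual qa.py code:

# Illustrative sketch: with in_channels hardcoded to 768, building the block
# with units != 768 would give LayerNorm gamma/beta shapes that do not match
# the input of dense_0; after the fix, the width follows `units`.
import mxnet as mx
from mxnet.gluon import nn


class PoolerBlock(nn.HybridBlock):
    """Simplified stand-in for the answer-scoring block in scripts/language_model/model/qa.py."""

    def __init__(self, units=768, prefix=None, params=None):
        super(PoolerBlock, self).__init__(prefix=prefix, params=params)
        with self.name_scope():
            self.dense_0 = nn.Dense(units, activation='tanh', flatten=False)
            self.dense_1 = nn.Dense(1, flatten=False)
            # After the fix: LayerNorm width follows `units`, not a hardcoded 768.
            self.layernorm = nn.LayerNorm(epsilon=1e-12, in_channels=units)

    def hybrid_forward(self, F, hidden_states):
        out = self.layernorm(self.dense_0(hidden_states))
        return self.dense_1(out)


block = PoolerBlock(units=1024)   # e.g. a large model with a 1024-dim hidden size
block.initialize()
scores = block(mx.nd.random.normal(shape=(2, 16, 1024)))  # (batch, seq_len, units)
print(scores.shape)               # (2, 16, 1)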
