This repository has been archived by the owner on Jan 15, 2024. It is now read-only.

Commit

fix transformer
zheyuye committed Jul 29, 2020
1 parent e49fbe1 commit d9c4140
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/gluonnlp/models/transformer.py

@@ -196,7 +196,7 @@ def __init__(self,
                      bias_initializer=bias_initializer,
                      dtype=self._dtype)
         attention_layout = 'NTK' if self._layout == 'NT' else 'TNK'
-        self.attention_cell =\
+        self.self_attention =\
             MultiHeadAttentionCell(
                 query_units=self._units,
                 num_heads=self._num_heads,
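
Note: the diff renames the attribute assigned in `__init__` from `attention_cell` to `self_attention`, presumably so it matches the name the rest of the layer looks up when it runs attention. The minimal sketch below illustrates that kind of mismatch and the fix in plain Python; the classes and the `FakeAttentionCell` stand-in are hypothetical and do not use the actual GluonNLP API.

```python
class FakeAttentionCell:
    """Stand-in for GluonNLP's MultiHeadAttentionCell; simply echoes the query."""
    def __call__(self, query, key, value, mask=None):
        return query


class EncoderLayerBefore:
    """Hypothetical reduction of the layer before this commit."""
    def __init__(self):
        # __init__ stored the cell under one name ...
        self.attention_cell = FakeAttentionCell()

    def forward(self, data):
        # ... but the forward pass looks it up under another name -> AttributeError
        return self.self_attention(data, data, data)


class EncoderLayerAfter:
    """Hypothetical reduction of the layer after this commit."""
    def __init__(self):
        # the assigned name now matches the name used in forward()
        self.self_attention = FakeAttentionCell()

    def forward(self, data):
        return self.self_attention(data, data, data)


if __name__ == "__main__":
    try:
        EncoderLayerBefore().forward("x")
    except AttributeError as err:
        print("before the fix:", err)  # no attribute 'self_attention'
    print("after the fix:", EncoderLayerAfter().forward("x"))  # echoes "x"
```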
