Implement standard attention and self-attention module #52
This was referenced on Nov 3, 2021 (two referencing PRs, both merged).
For this, I think we really should clarify the dim tags first (#17).
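For reference, "standard attention" here presumably means scaled dot-product attention as in the Transformer. A minimal single-head sketch in plain NumPy (function names and shapes are illustrative, not the returnn_common API, which would use dim tags instead of positional axes):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Scaled dot-product attention.
    q: [time_q, dim], k: [time_kv, dim], v: [time_kv, dim_v]."""
    scores = q @ k.T / np.sqrt(q.shape[-1])  # [time_q, time_kv]
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # mask out disallowed positions
    # softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # [time_q, dim_v]

def self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention: queries, keys and values all come from x."""
    return scaled_dot_product_attention(x @ w_q, x @ w_k, x @ w_v)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))  # [time, dim]
w = [rng.standard_normal((8, 8)) for _ in range(3)]
out = self_attention(x, *w)
print(out.shape)  # (5, 8)
```

A real module would add multiple heads, an output projection, and masking for causal/self-attention in decoding.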
albertz added a commit that referenced this issue on Nov 6, 2021.
Just for reference, this lacks positional encoding so far. Some relevant discussion:
See also: PyTorch.
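The usual way to add position information is the sinusoidal encoding from "Attention Is All You Need", which is simply summed onto the input embeddings. A minimal sketch (assuming an even feature dim; not the returnn_common API):

```python
import numpy as np

def sinusoidal_positional_encoding(num_positions, dim):
    """Sinusoidal positional encoding; returns [num_positions, dim].
    Even channels get sin, odd channels get cos, with geometrically
    decreasing frequencies (base 10000). dim is assumed even here."""
    positions = np.arange(num_positions)[:, None]             # [T, 1]
    inv_freq = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))  # [dim/2]
    angles = positions * inv_freq[None, :]                    # [T, dim/2]
    pe = np.zeros((num_positions, dim))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = sinusoidal_positional_encoding(50, 16)
print(pe.shape)  # (50, 16)
```

Alternatives discussed elsewhere include learned absolute embeddings and relative positional encoding inside the attention scores.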
Via cum_concat etc.: rwth-i6/returnn#391
rwth-i6/returnn#656
rwth-i6/returnn#589
rwth-i6/returnn#590
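The cum_concat idea is that in step-wise (decoder) execution, self-attention keeps a growing state of accumulated keys and values, one new frame per step. A sketch of that mechanism in plain NumPy (illustrative only; RETURNN's cum_concat operates on layer outputs with dim tags):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def step_self_attention(x_t, state, w_q, w_k, w_v):
    """One decoding step of causal self-attention.
    state = (keys, values) accumulated over previous steps,
    analogous to what cum_concat provides."""
    k_t, v_t = x_t @ w_k, x_t @ w_v
    ks = np.concatenate([state[0], k_t[None, :]], axis=0)  # [t+1, dim]
    vs = np.concatenate([state[1], v_t[None, :]], axis=0)  # [t+1, dim]
    q_t = x_t @ w_q
    scores = ks @ q_t / np.sqrt(q_t.shape[-1])             # [t+1]
    return softmax(scores) @ vs, (ks, vs)

dim = 8
rng = np.random.default_rng(0)
w_q, w_k, w_v = (rng.standard_normal((dim, dim)) for _ in range(3))
state = (np.zeros((0, dim)), np.zeros((0, dim)))  # empty accumulated state
xs = rng.standard_normal((4, dim))
outs = []
for x_t in xs:  # step-wise loop, as in search/decoding
    y_t, state = step_self_attention(x_t, state, w_q, w_k, w_v)
    outs.append(y_t)
```

Run over a whole sequence, this produces the same result as full self-attention with a causal mask, which is what makes the training-time and search-time behavior consistent.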