
[Cherry-pick] Use scaled_dot_product_attention in Wav2vec2/HuBERT's SelfAttention (#3253) #3261

Merged 1 commit on Apr 11, 2023

Conversation

nateanl
Member

@nateanl nateanl commented Apr 10, 2023

Summary:
Replace the attention computation with `torch.nn.functional.scaled_dot_product_attention` to improve running efficiency.

Pull Request resolved: #3253

Reviewed By: mthrok

Differential Revision: D44800353

Pulled By: nateanl

fbshipit-source-id: 41550d868c809099aadbe812b0ebe2c38121efb8
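The change amounts to swapping a hand-written attention computation for PyTorch's fused kernel. A minimal sketch of the before/after (shapes and variable names are hypothetical, not torchaudio's actual code):

```python
import torch
import torch.nn.functional as F

# Hypothetical dimensions for illustration only.
batch, num_heads, seq_len, head_dim = 2, 4, 16, 8
q = torch.randn(batch, num_heads, seq_len, head_dim)
k = torch.randn(batch, num_heads, seq_len, head_dim)
v = torch.randn(batch, num_heads, seq_len, head_dim)

# Before: attention computed manually in SelfAttention.
scale = head_dim ** -0.5
weights = torch.softmax((q * scale) @ k.transpose(-2, -1), dim=-1)
manual_out = weights @ v

# After: single fused call (PyTorch >= 2.0), which can dispatch to
# FlashAttention or memory-efficient kernels on supported hardware.
fused_out = F.scaled_dot_product_attention(q, k, v)

print(torch.allclose(manual_out, fused_out, atol=1e-5))
```

The fused call avoids materializing the full attention-weight matrix in eager mode when an optimized backend is available, which is where the efficiency gain comes from.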

@nateanl nateanl requested a review from a team April 10, 2023 23:52
@nateanl nateanl merged commit e99de15 into pytorch:release/2.0 Apr 11, 2023