Return attention mask in ASR pipeline to avoid warnings #33509
Conversation
Force-pushed from 8d0d13b to 3e7b53e
cc: @ylacombe here

Hey @Rocketknight1, thanks for opening the PR!

@ylacombe there are some slow tests that I can't get working on my local machine (even on

Let me know if you want me to run them!

@ylacombe Sure!
ylacombe
left a comment
So, I first opened a PR (#33545) to fix some of the slow tests that were failing due to how data was loaded.
Your PR doesn't add any failing tests compared to main, and the changes make sense, so I think it should be OK to merge!
Okay, cool! cc @LysandreJik for core maintainer review.
LysandreJik
left a comment
Thanks for the PR @Rocketknight1!
Hi @Rocketknight1,

Hi, where can I set
…33509) return attention mask in ASR pipeline
I'm still hitting the error fixed here with the latest transformers==4.47.1 when running this code:
cc @sanchit-gandhi - this PR just sets `return_attention_mask=True` on the preprocessors in the `automatic_speech_recognition` pipeline to avoid warnings caused by missing attention masks. It works okay in my testing, but please let me know if you think it could cause any problems!

Fixes #33498
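For readers unfamiliar with what `return_attention_mask=True` does: when a feature extractor pads variable-length audio to a common length, the attention mask marks which frames are real input (1) versus padding (0), so the model can ignore the padded tail. The following is a minimal, self-contained sketch of that padding behavior, not the actual transformers implementation; the function name `pad_batch` is illustrative only.

```python
# Illustrative sketch of padding with an optional attention mask, mimicking
# what return_attention_mask=True asks a feature extractor to produce.
def pad_batch(sequences, pad_value=0.0, return_attention_mask=True):
    """Pad sequences to equal length; optionally return a 0/1 attention mask."""
    max_len = max(len(s) for s in sequences)
    padded, masks = [], []
    for seq in sequences:
        n_pad = max_len - len(seq)
        padded.append(list(seq) + [pad_value] * n_pad)
        masks.append([1] * len(seq) + [0] * n_pad)  # 1 = real sample, 0 = padding
    if return_attention_mask:
        return {"input_values": padded, "attention_mask": masks}
    return {"input_values": padded}

batch = pad_batch([[0.1, 0.2, 0.3], [0.4]])
# batch["attention_mask"] is [[1, 1, 1], [1, 0, 0]]
```

Without the mask, a downstream model has no reliable way to distinguish padded zeros from genuine silence, which is the source of the warnings this PR silences.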