```
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:02
You have passed task=transcribe, but also have set forced_decoder_ids to [[1, None], [2, 50360]] which creates a conflict. forced_decoder_ids will be ignored in favor of task=transcribe.
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:17
Passing a tuple of past_key_values is deprecated and will be removed in Transformers v4.43.0. You should pass an instance of EncoderDecoderCache instead, e.g. past_key_values=EncoderDecoderCache.from_legacy_cache(past_key_values).
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:20
The attention mask is not set and cannot be inferred from input because pad token is same as eos token. As a consequence, you may observe unexpected behavior. Please pass your input's attention_mask to obtain reliable results.
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:46
```
It's getting stuck here and transcribing forever.
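
For context, `insanely-fast-whisper` wraps the Transformers automatic-speech-recognition pipeline, so a roughly equivalent direct call looks like the sketch below. This is a minimal sketch, not the CLI's exact code path, and the chunk length and batch size are illustrative values rather than the tool's defaults:

```python
import torch
from transformers import pipeline

# Build the same kind of ASR pipeline the CLI uses: fp16 weights on the
# Apple Silicon GPU via the PyTorch MPS backend.
pipe = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-large-v3",
    torch_dtype=torch.float16,
    device="mps",
)

# Chunked long-form decoding over the input file; values here are
# illustrative, not necessarily what insanely-fast-whisper passes.
outputs = pipe(
    "test.mp3",
    chunk_length_s=30,
    batch_size=4,
    return_timestamps=True,
    generate_kwargs={"task": "transcribe"},
)
print(outputs["text"])
```

Running the pipeline directly like this can help separate a CLI-level problem from slow or stalled Whisper generation on the MPS backend itself.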
I'm having the same issue. The command is `insanely-fast-whisper --file-name test.mp3 --device-id mps --model-name distil-whisper/distil-large-v3` or `insanely-fast-whisper --file-name test.mp3 --device-id mps --model-name openai/whisper-large-v3-turbo`, with or without `--batch-size 4`.
I've followed the instructions exactly, but I still can't get an audio file transcribed in a few minutes.
```
insanely-fast-whisper --file-name /[test].wav --device-id mps
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:02
You have passed task=transcribe, but also have set forced_decoder_ids to [[1, None], [2, 50360]] which creates a conflict. forced_decoder_ids will be ignored in favor of task=transcribe.
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:17
Passing a tuple of past_key_values is deprecated and will be removed in Transformers v4.43.0. You should pass an instance of EncoderDecoderCache instead, e.g. past_key_values=EncoderDecoderCache.from_legacy_cache(past_key_values).
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:20
The attention mask is not set and cannot be inferred from input because pad token is same as eos token. As a consequence, you may observe unexpected behavior. Please pass your input's attention_mask to obtain reliable results.
🤗 Transcribing... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:00:46
```