Hello Hiro, after I trained a streaming Transformer model (MMA) on the Librispeech corpus, I tried to decode the results with score.sh.
It seemed to go well at first, but the decoding failed after a few steps, with the error message as follows:
```
# This part is just a warning.
  0%|          | 0/2703 [00:00<?, ?it/s]/mnt/data1/jungwonchang/projects/neural_sp/neural_sp/models/modules/mocha.py:815: UserWarning: This overload of nonzero is deprecated:
        nonzero()
Consider using one of the following signatures instead:
        nonzero(*, bool as_tuple) (Triggered internally at /opt/conda/conda-bld/pytorch_1595629427478/work/torch/csrc/utils/python_arg_parser.cpp:766.)
  boundary = alpha[b, h, 0, 0].nonzero()[:, -1].min().item()
  9%|▊         | 232/2703 [15:06<1:49:35, 2.66s/it]
Original utterance num: 2703
Removed 0 empty utterances

# This part is where I get errors
Traceback (most recent call last):
  File "/home/jungwonchang/projects1/neural_sp/examples/librispeech/s5/../../../neural_sp/bin/asr/eval.py", line 247, in <module>
    main()
  File "/home/jungwonchang/projects1/neural_sp/examples/librispeech/s5/../../../neural_sp/bin/asr/eval.py", line 182, in main
    oracle=True)
  File "/mnt/data1/jungwonchang/projects/neural_sp/neural_sp/evaluators/wordpiece.py", line 85, in eval_wordpiece
    ensemble_models=models[1:] if len(models) > 1 else [])[0]
  File "/mnt/data1/jungwonchang/projects/neural_sp/neural_sp/models/seq2seq/speech2text.py", line 763, in decode
    ensmbl_eouts, ensmbl_elens, ensmbl_decs)
  File "/mnt/data1/jungwonchang/projects/neural_sp/neural_sp/models/seq2seq/decoders/transformer.py", line 896, in beam_search
    rightmost_frame = max(0, aws_last_success[0, :, 0].nonzero()[:, -1].max().item()) + 1
RuntimeError: operation does not have an identity.
  9%|▊         | 232/2703 [15:07<2:41:09, 3.91s/it]
```
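For reference, the `RuntimeError` looks like what PyTorch 1.6 (the build referenced in the warning above) raises when `.max()` is called on an empty tensor, i.e. when `aws_last_success[0, :, 0].nonzero()` finds no nonzero attention weights. Here is a minimal sketch of that situation with hypothetical shapes, not the actual decoder tensors:

```python
import torch

# Hypothetical all-zero attention weights: no boundary was detected,
# so nonzero() returns an empty tensor.
aws_last_success = torch.zeros(1, 4, 1, 10)

idx = aws_last_success[0, :, 0].nonzero()[:, -1]  # shape (0,), empty
rightmost_frame = max(0, idx.max().item()) + 1    # RuntimeError: operation does not have an identity.
```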
The configuration I used was `conf/asr/mma/streaming/lc_transformer_mma_subsample8_ma4H_ca4H_w16_from4L_64_128_64.yaml`.
Also, I found in decode.log that the `streamable` flag for my model was `False`.
Any idea or advice on this issue?