[WIP] [egs] port tdnn_7m23t script to librispeech #2233
danpovey merged 3 commits into kaldi-asr:master from
Conversation
@@ -0,0 +1,246 @@
#!/bin/bash

## Adapted from swbd for librispeech by David van Leeuwen
Would you mind redoing this based on the checked-in librispeech example in master? (I think it's 7n or something like that.) I renamed some of the layers in a way that I consider clearer. Also rename to run_tdnn_1c.sh and change the suffix to 1c. And we need a comparison with the old results, produced by compare_wer.sh. It's better if you actually run the old setup (in 1b) and compare with that; if not, you could try to fake it based on the compare_wer.sh output in the comment at the top of that script.
Sure, no problem, I can do that. All it takes is time, since I only have a small cluster.
This script has been re-modeled after run_tdnn_1b.sh. We still need to re-model the RNN training config according to swbd 1c, and do complete testing.
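For the comparison requested above, a typical invocation might look like the sketch below. This assumes the usual layout of Kaldi's librispeech chain recipes, where local/chain/compare_wer.sh takes experiment directories; the exact directory names (tdnn_1b_sp, tdnn_1c_sp) are illustrative and depend on the affix and speed-perturbation options actually used.

```
# Hypothetical sketch: compare the old 1b setup against the new 1c port.
# Paths assume a standard egs/librispeech/s5 chain layout; adjust to your setup.
cd egs/librispeech/s5
local/chain/compare_wer.sh exp/chain/tdnn_1b_sp exp/chain/tdnn_1c_sp
```

The script prints a side-by-side table of WERs (and train/valid objective values) for the listed experiment directories, which is what would go in the comment at the top of run_tdnn_1c.sh.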
I re-modeled the body of the script to [...] Don't merge yet---I still need to [...]
You're OK to merge now. Re-ran the [...] I can rebase and change subject to [...]
Thanks! Merging.
This is the port of the tdnn_7m23t.sh script from swbd to librispeech. It took a while, since I wanted to verify that all steps actually run. Still, I added one or two things afterwards (e.g., the num_leaves parameter). I didn't make it into PR 2114 in time, so this is a PR against master instead.
Results measured are: