
[src,scripts,egs] Add form of dropout that shares the mask across frames.#2244

Merged
danpovey merged 16 commits into kaldi-asr:master from danpovey:dropout-general on Mar 1, 2018
Conversation

@danpovey (Contributor) commented Mar 1, 2018

No description provided.

@danpovey (Contributor, Author) commented Mar 1, 2018

Just a note:
I was hoping by now to also have recipes with dropout for WSJ and Switchboard. I thought I was getting improvements from it on those setups, as well as on mini_librispeech, but experiments @GaofengCheng has done seem to show that most or all of that improvement was due to other tuning changes.
We may update those recipes soon though.

For now this type of dropout seems to mostly be useful for very small-data setups, like mini_librispeech. @freewym, I have included a "block-dim" configuration value in this component that should be useful for applying it to image recognition setups (where it should be set to the num-filters)... this would be basically like GroupOut. It would be good if you could test that.
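To make the idea concrete, here is a minimal numpy sketch (not Kaldi's actual C++ component) of dropout where the mask is drawn once per utterance and shared across all frames, with an optional block-dim that tiles a smaller mask across the feature dimension so that whole blocks (e.g. filters) are dropped together. The function name, the tiling layout, and the inverted-dropout rescaling are assumptions for illustration; Kaldi's component and its dropout schedule may differ in these details.

```python
import numpy as np

def shared_mask_dropout(x, dropout_proportion=0.2, block_dim=None, rng=None):
    """Sketch of dropout with a mask shared across frames.

    x: array of shape (num_frames, dim), one utterance.
    block_dim: if set, a mask of this size is drawn once and tiled
      across the feature dimension (dim must be a multiple of it),
      so whole blocks/filters are kept or dropped together.
    NOTE: the tiling layout and the 1/(1-p) rescaling here are
    illustrative assumptions, not necessarily what Kaldi does.
    """
    rng = rng or np.random.default_rng()
    num_frames, dim = x.shape
    mask_dim = block_dim if block_dim is not None else dim
    assert dim % mask_dim == 0, "dim must be a multiple of block_dim"
    # Draw ONE mask per utterance (shape (1, mask_dim)), not per frame.
    keep = (rng.random((1, mask_dim)) > dropout_proportion).astype(x.dtype)
    mask = np.tile(keep, (1, dim // mask_dim))  # expand to the full dim
    scale = 1.0 / (1.0 - dropout_proportion)    # inverted-dropout scaling
    # Broadcasting applies the same mask to every frame.
    return x * mask * scale
```

Because the mask has a frame axis of 1, broadcasting zeroes the same columns in every frame of the utterance, which is the "shares the mask across frames" behavior this PR adds.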

@freewym (Contributor) commented Mar 1, 2018 via email

@danpovey danpovey merged commit 9a4ba5e into kaldi-asr:master Mar 1, 2018
LvHang pushed a commit to LvHang/kaldi that referenced this pull request Apr 14, 2018
…mes. (kaldi-asr#2244)

Conflicts:
	egs/mini_librispeech/s5/local/chain/run_tdnn.sh
	egs/wsj/s5/local/chain/tuning/run_tdnn_1f.sh
	egs/wsj/s5/steps/nnet3/chain/train.py
	src/chain/chain-training.cc
	src/nnet3/nnet-chain-training.cc
	src/nnet3/nnet-compute.cc
	src/nnet3/nnet-normalize-component.h
	src/nnet3/nnet-training.cc
Skaiste pushed a commit to Skaiste/idlak that referenced this pull request Sep 26, 2018