Conversation

@vimalmanohar
Contributor

No description provided.

@danpovey
Contributor

Thanks a lot... I'm afraid I am working on a bunch of other changes that may conflict with this, so I think it's better if we wait and have you commit this in a day or two. I probably won't need the lda-related code, as I think I have already resolved that issue everywhere, but the other part is relevant.

@danpovey
Contributor

OK, I have now committed those other changes I was talking about. You'll probably have conflicts.

@danpovey
Contributor

@vimalmanohar, can you please try to resolve the conflicts in this PR?

@danpovey
Contributor

@vimalmanohar, sorry, I dropped the ball on this, and new conflicts have appeared. Would you mind resolving them?

# use during decoding
common_train_lib.copy_egs_properties_to_exp_dir(egs_dir, args.dir)

if (args.stage <= -3) and os.path.exists(args.dir+"/configs/init.config"):
add_lda = common_train_lib.is_lda_added(config_dir)

Contributor

Are there other reasons we use init.config other than to make the LDA-like transform?

Contributor Author

Probably not. It looks like it will be removed in the transfer learning PR.

Contributor Author

init.config is still used to create the initial model. The check is needed to know whether the LDA needs to be trained.

Contributor

Did you see my comment:
"I think you are mistaken, if we are talking about the current kaldi_52 code; init.config is only used if we are doing the LDA thing."
Can you please update the PR?

Contributor Author

I do not understand what needs to be done. The current check is needed to know whether LDA needs to be trained. The function is_lda_added can be changed to read init.raw instead, if needed.
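For context, the check under discussion could look something like the sketch below. This is a hypothetical illustration, not the actual Kaldi implementation: the function name is_lda_added, the config_dir argument, and the init.config path come from the thread, but the way the file is inspected (scanning for an "lda"-related line) is an assumption for illustration only.

```python
import os


def is_lda_added(config_dir):
    """Return True if the init config appears to request an LDA-like transform.

    HYPOTHETICAL SKETCH: the real Kaldi check may instead read init.raw,
    as suggested in the thread above; the 'lda' keyword match here is an
    assumption made purely for illustration.
    """
    init_config = os.path.join(config_dir, "init.config")
    if not os.path.exists(init_config):
        # No init.config at all, so there is no LDA-like transform to train.
        return False
    with open(init_config) as f:
        # Assume an LDA-like transform shows up as a line mentioning "lda".
        return any("lda" in line.lower() for line in f)
```

Under this sketch, the training script would only run the LDA-estimation stage when both init.config exists and the check above returns True.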

@danpovey
Contributor

danpovey commented May 26, 2017 via email

@danpovey
Contributor

danpovey commented May 26, 2017 via email

@danpovey
Contributor

danpovey commented May 29, 2017 via email

@danpovey danpovey changed the base branch from kaldi_52 to master May 30, 2017 22:52
@danpovey
Contributor

Closing this as I'm working on a version of this myself, along with certain other code cleanups.

@danpovey danpovey closed this May 30, 2017
danpovey added a commit to danpovey/kaldi that referenced this pull request May 30, 2017