About the exact iteration number of LJSpeech pretrained model #153
Sorry for the confusion, but the name doesn't mean the model was trained for 320k steps in total. I had noted this before:

Ref: #129. I initially thought that notice was enough of an explanation, but apparently not, since I've gotten the same question twice. I will update the filename of the pretrained model to avoid the confusion.
Thanks for the quick response and hard work. Now I know I have to keep on training. I hope anyone who shares the same confusion can see this post!
Thanks for this explanation. However, I'm still puzzled: after loading the checkpoint

Line 13 in 474e411

and training a new model based on it, don't these mean that the step count should continue from there?
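One way to settle questions like this is to inspect the checkpoint file directly: a PyTorch checkpoint is just a pickled dict, so the bookkeeping it carries can be read out. A minimal sketch, assuming the file stores a `global_step` entry alongside the weights (that key name is an assumption modeled on this repo's training script, not confirmed here):

```python
import os

import torch

# The pretrained checkpoint discussed in this thread; adjust the path as needed.
PATH = "20180510_mixture_lj_checkpoint_step000320000_ema.pth"

if os.path.exists(PATH):
    # map_location="cpu" lets this run on a machine without a GPU.
    checkpoint = torch.load(PATH, map_location="cpu")
    # List the bookkeeping stored alongside the weights.
    print(sorted(checkpoint.keys()))
    # The counter that the step number in the filename reflects;
    # "global_step" is an assumed key and may differ in practice.
    print(checkpoint.get("global_step"))
```

Whatever number `global_step` holds is the counter of the run that saved the file, which, as discussed below, need not equal the total number of steps across all runs.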
Hmm... maybe I didn't express myself clearly. I just want to confirm why the step counter continues from 320k rather than 1000k, since you said it was trained for over 1000k steps...
For example,

Then model 2 was trained for 1000k steps in total, but the checkpoint only keeps the number of iterations from its last run (i.e., 300k steps).
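The bookkeeping described above can be sketched in plain Python. This is an illustration only, not the repo's actual training code: the `train` helper is hypothetical, and the filename pattern is modeled on the checkpoint names in this thread. The numbers follow the example (700k steps in a first run, 300k in a second, 1000k total):

```python
# Minimal sketch of checkpoint step bookkeeping, using a plain dict in
# place of a real PyTorch checkpoint. The "global_step" key and filename
# pattern are assumptions; train() is purely illustrative.

def train(steps, checkpoint=None):
    """Run `steps` iterations, optionally resuming from a checkpoint."""
    global_step = checkpoint["global_step"] if checkpoint else 0
    global_step += steps  # one counter increment per optimizer update
    # The saved filename encodes only this run's counter, not the total
    # number of steps across the model's whole training lineage.
    return {"global_step": global_step,
            "name": f"checkpoint_step{global_step:09d}.pth"}

# Model 1: trained from scratch for 700k steps.
ckpt1 = train(700_000)

# Model 2: initialized from model 1's *weights* but started as a fresh
# run, so its counter restarts at zero and only reaches 300k:
ckpt2 = train(300_000)
print(ckpt2["name"])  # checkpoint_step000300000.pth

# Had model 2 instead *resumed* model 1's checkpoint, the counter would
# continue from 700k and the name would reflect the full 1000k:
ckpt3 = train(300_000, checkpoint=ckpt1)
print(ckpt3["name"])  # checkpoint_step001000000.pth
```

So a file whose name ends in `step000320000` can still sit on top of many hundreds of thousands of earlier steps from a previous run.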
Gotcha! Thanks for your patient reply.
Hi all,

I downloaded the pretrained model 20180510_mixture_lj_checkpoint_step000320000_ema.pth. As its name suggests, it was trained for 320k steps. I tried synthesising audio with it, and the quality is very good.

Then I trained my own model on LJSpeech using the same preset as the author's. It has now run for over 620k steps, but there is a little noise in the background. Here is my log:


So I am wondering what's wrong with my training. After some reading, I found some contradictory explanations in README.md.
For example,
It says this model was trained for over 1000k steps, but its filename suggests it was trained for only 320k.
And in #1 (comment), the author also mentioned that he trained the model for over 1000k steps.
So, what is the exact iteration number of this 20180510_mixture_lj_checkpoint_step000320000_ema.pth?

This is important because I am trying to figure out whether something is wrong with my own training. If the pretrained model really is at 320k steps, then my training is definitely wrong and I can start debugging.