training a classifier should overwrite the .lex #484
Comments
@danyaljj do you have any comments on this?
Just to clarify: are you saying that training a model writes to disk (the lexicon file) before/without calling …
No, with or without …
I see. So you think we should always remove the lexicon file at the beginning of …
I expected it to be overwritten by default; we need a way to indicate whether we want to continue training or train from scratch, because removing those files at the beginning of the …
Right, I agree it's tricky.
What do you think?
Sounds good to me. @Rahgooy might have comments.
I think it is fine for training a single model, but when we want to train multiple models, say in a loop, the user would have to wait for the first model to finish training and then enter [Y/N]. IMO, the better option is to make it a parameter or something.
In fact, for joint training we have the …
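The parameter-based approach could look like the following minimal sketch. All names here (`train`, `overwrite`, `lexicon_dir`) are hypothetical, not Saul's actual API: the flag decides whether a stale `.lex` file is deleted before training, so a batch loop runs unattended instead of stopping at a [Y/N] prompt between models.

```python
import os
import tempfile

def train(model_name, features, overwrite=True, lexicon_dir="."):
    """Hypothetical trainer: `overwrite` replaces an interactive [Y/N] prompt.

    overwrite=True  -> delete any stale .lex file and train from scratch
    overwrite=False -> keep the existing lexicon and continue training
    """
    lex_path = os.path.join(lexicon_dir, model_name + ".lex")
    if overwrite and os.path.exists(lex_path):
        os.remove(lex_path)  # fresh start: old features are discarded
    with open(lex_path, "a") as lex:  # append, so resumed runs extend it
        for feat in features:
            lex.write(feat + "\n")
    return lex_path

# Unattended batch training in a scratch directory: no prompt blocks
# the loop between models.
scratch = tempfile.mkdtemp()
for name in ["model_a", "model_b"]:
    train(name, ["w=the", "pos=DT"], lexicon_dir=scratch)
```

With a keyword argument the default (`overwrite=True`) matches the expectation above, while `overwrite=False` opts in to continued training, and neither requires user interaction.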
It seems that if a classifier's .lex file was created earlier and exists in the default path, retraining the classifier adds features to the same lexicon; that is, the lexicon is not overwritten.
(We need tests for load, save, and for classifiers created from scratch; related to #411.)
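The reported behaviour can be reproduced with a minimal sketch (illustrative only, not Saul's actual code, and `write_lexicon` is a hypothetical helper): when the lexicon file is opened in append mode, a second training run grows the old lexicon instead of replacing it, whereas truncating on open (`"w"`) gives the expected overwrite.

```python
import os
import tempfile

def write_lexicon(path, features, mode="a"):
    # mode="a" mimics the bug: new features are appended after the old ones.
    # mode="w" truncates the file first, i.e. the expected overwrite.
    with open(path, mode) as f:
        for feat in features:
            f.write(feat + "\n")

lex = os.path.join(tempfile.mkdtemp(), "demo.lex")
write_lexicon(lex, ["f1", "f2"])       # first training run: 2 features
write_lexicon(lex, ["f1", "f3"])       # retrain: lexicon grows to 4 lines
write_lexicon(lex, ["f1", "f3"], "w")  # "w" overwrites: back to 2 lines
```

A test along these lines (asserting the lexicon's line count after a retrain) would catch a regression of this bug.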