Initialization #4
By the way, it seems you multiply the same alpha into all examples, but to my understanding, alpha should be different for positive and negative examples.
I explore the method for proposal + classification, not regression (like SSD). The initialization is a technique for early training that balances the heavily imbalanced examples, but my goal is to compare the method with OHEM for hard example mining, so the initialization is not used in my experiments, and the effect of alpha is equivalent to the effect of the learning rate. If you have other ideas about alpha, could you tell me how you would use it?
@unsky Thanks! I have done some experiments with a regression framework, and I just multiply (1 - alpha) into the positive examples, which gives a tiny improvement though.
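
For reference, here is a minimal sketch of the alpha-balanced sigmoid focal loss as described in the original paper, where alpha weights the positive examples and (1 - alpha) weights the negatives; this is an illustration in PyTorch, not the code from this repository, and the function name and defaults are my own.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Alpha-balanced sigmoid focal loss (sketch).

    targets is a {0, 1} tensor with the same shape as logits;
    alpha weights positive examples and (1 - alpha) weights
    negative examples, following the focal loss paper.
    """
    p = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    # p_t is the model's probability for the ground-truth class
    p_t = p * targets + (1.0 - p) * (1.0 - targets)
    # alpha_t differs for positive and negative examples
    alpha_t = alpha * targets + (1.0 - alpha) * (1.0 - targets)
    loss = alpha_t * (1.0 - p_t) ** gamma * ce
    return loss.sum()
```
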
@ruinmessi If you don't mind sharing, what were your conclusions from the comparison (OHEM vs. focal loss, and how focal loss does on proposal + classification)? I'm curious about something similar.
What is your initialization of the detector? Is it exactly the same as the original paper, which sets bias = -log((1 - pi)/pi), or do you run normal softmax training for several iterations as you previously claimed?
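
For context, a minimal sketch of the prior-probability bias initialization from the paper is shown below; the layer shape and variable names (`num_anchors`, `num_classes`, `cls_head`) are hypothetical and only illustrate the bias formula, not this repository's actual detector head.

```python
import math
import torch.nn as nn

# Hypothetical final classification layer of a one-stage detector head.
num_anchors, num_classes = 9, 80
cls_head = nn.Conv2d(256, num_anchors * num_classes, kernel_size=3, padding=1)

# Set the bias so that sigmoid(bias) = pi at the start of training,
# i.e. bias = -log((1 - pi) / pi); the paper uses pi = 0.01 so that
# the rare positive class dominates the initial loss less.
pi = 0.01
nn.init.normal_(cls_head.weight, std=0.01)
nn.init.constant_(cls_head.bias, -math.log((1.0 - pi) / pi))
```
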