Zero-One Loss in Classification #18
Thanks for this awesome library. 💯

In #7 you discussed the use of alternative loss functions in classification.

I'm working on a use case where I perform classification with several different classifiers, but I mainly care about the accuracy of the predictions rather than the predicted probabilities, since some classifiers yield only hard probabilities. As such, I wanted to swap the cross-entropy loss for the zero-one loss.

I extended utils.py (see here), added the new loss function, and call it when computing SAGE values.

I was wondering whether there is more to consider when using alternative loss functions, in particular the zero-one loss, with SAGE. Would you also be interested in a PR?

Comments

Hi, thanks for adding this! It looks reasonable to me, and I can't think of anything else you would need to consider when implementing it. The existing modules should make it easy to swap in a new loss function like this. Submitting a PR would be great, if you're willing to. Would you also consider adding a notebook, or an example with this loss function in one of the existing notebooks? Just so we can verify that it yields reasonable results.

@iancovert Thanks for your feedback. 🎉 I'd love to contribute this feature. I'll also prepare a notebook and maybe some tests. It will take me some time, as I'm working on my thesis.

That sounds great! And no rush, I'm also working on my thesis so I understand :)

Sorry it took me a while, but I just merged your PR. I checked out the code and it all looks reasonable. Thanks for the contribution and for using the package!
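The exact code from the issue is not shown above, but a zero-one loss of the kind discussed could be sketched roughly as follows. The function name and signature here are assumptions for illustration, not the actual code merged into the package; it simply returns a per-sample loss of 1 for a misclassification and 0 otherwise, accepting either class probabilities or hard labels.

```python
import numpy as np

def zero_one_loss(pred, target):
    """Zero-one loss: 1 for a misclassification, 0 for a correct prediction.

    `pred` may be class probabilities (2-D, one row per sample) or hard
    class labels (1-D); `target` holds integer class labels. Returns one
    loss value per sample.
    """
    pred = np.asarray(pred)
    target = np.asarray(target)
    if pred.ndim == 2:
        # Probabilities: take the most likely class as the prediction.
        predicted_class = np.argmax(pred, axis=1)
    else:
        # Already hard labels.
        predicted_class = pred
    return (predicted_class != target).astype(float)
```

One design point worth noting: because the loss is computed per sample and averaged downstream like any other loss, it slots into the same place as cross-entropy; the main practical difference is that it ignores calibration entirely, which matches the use case above where only accuracy matters.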