Thanks for the note. That's something I also noticed some time ago. I don't think it's a "bug" bug, but more a limitation of the method. Right now I don't have a good solution for it, at least not for this simple 0-1 decomposition. Maybe it could make sense to introduce a flavor of this under a slightly different name.
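For context, here is a hedged sketch of why the simple sum can fail for 0-1 loss, following Domingos' unified decomposition (the notation below is mine, not mlxtend's): with two classes, the per-example variance enters the loss with a negative sign wherever the main prediction is already biased.

```latex
% Per-example 0-1 loss decomposition for two classes.
% \hat{y}_m : main (mode) prediction over bootstrap rounds
% B(x) = \mathbb{1}[\hat{y}_m \neq y] \in \{0, 1\}   (bias)
% V(x) = P(\hat{y} \neq \hat{y}_m)                   (variance)
L(x) = B(x) + \bigl(1 - 2B(x)\bigr)\,V(x)
% B(x) = 0 :  L(x) = V(x)       (variance adds to the loss)
% B(x) = 1 :  L(x) = 1 - V(x)   (variance subtracts from the loss)
```

So summing average bias and average variance uniformly overestimates the average loss by twice the variance accumulated on biased examples, which would explain the mismatch reported below without any term being computed incorrectly.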
Describe the bug
Total Loss != Bias^2 + Variance for '0-1_loss'
Steps/Code to Reproduce
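The original reproduction code was not captured in this thread. As a stand-in, here is a hypothetical, self-contained sketch (NumPy only, simulated bootstrap predictions rather than mlxtend's actual `bias_variance_decomp` internals) showing that for 0-1 loss the naive sum of average bias and average variance generally exceeds the average loss, while a signed "net variance" recovers it exactly:

```python
# Hypothetical sketch of the 0-1 loss decomposition mismatch.
# Assumption: predictions from bootstrap rounds are simulated as noisy
# copies of the true labels; names here are illustrative, not mlxtend's.
import numpy as np

rng = np.random.default_rng(0)

n_examples, n_rounds = 200, 50
y_true = rng.integers(0, 2, size=n_examples)
# Each of n_rounds "models" flips each true label with probability 0.4.
flip = rng.random((n_rounds, n_examples)) < 0.4
preds = np.where(flip, 1 - y_true, y_true)

main_pred = (preds.mean(axis=0) > 0.5).astype(int)  # per-example mode
bias = (main_pred != y_true).astype(float)          # 0/1 bias per example
var = (preds != main_pred).mean(axis=0)             # P(pred != mode)
loss = (preds != y_true).mean(axis=0)               # P(pred != y_true)

avg_loss = loss.mean()
naive_sum = bias.mean() + var.mean()  # the quantity compared in this report
# Net variance: variance subtracts from the loss on biased examples.
net_sum = bias.mean() + np.where(bias == 0, var, -var).mean()

print(f"avg 0-1 loss : {avg_loss:.4f}")
print(f"bias + var   : {naive_sum:.4f}  (generally >= avg loss)")
print(f"bias + netvar: {net_sum:.4f}  (matches avg loss)")
```

The gap between `naive_sum` and `avg_loss` grows with the variance on biased examples, consistent with the maintainer's point above that this is a limitation of the simple 0-1 decomposition rather than a computational bug.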
Expected Results
Expecting Total Loss = Bias^2 + Variance
Actual Results
Versions
MLxtend 0.21.0dev
macOS-10.16-x86_64-i386-64bit
Python 3.8.12 | packaged by conda-forge | (default, Oct 12 2021, 21:50:38)
[Clang 11.1.0 ]
Scikit-learn 1.1.2
NumPy 1.23.2
SciPy 1.9.0