Conversation


@przemekwitek przemekwitek commented Mar 12, 2020

This PR makes the comparison used in evaluation metrics more lenient.
Instead of comparing raw field values, it now compares their string representations, so that e.g. an actual field value of "1" and a predicted field value of 1 are treated as equal.

Relates #53485
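The lenient comparison described above can be sketched roughly as follows. This is a minimal, hypothetical illustration (the class and method names are not from the PR, which modifies the evaluation metric aggregations themselves): instead of comparing the raw values with `Objects.equals`, compare their `String.valueOf` representations.

```java
// Hypothetical sketch of the lenient comparison described in the PR text:
// values whose string representations match are considered equal, so a
// keyword-mapped "1" and an integer-mapped 1 no longer count as a mismatch.
public class LenientEquality {

    static boolean lenientEquals(Object actual, Object predicted) {
        if (actual == null || predicted == null) {
            // only two nulls are equal; a null never matches a real value
            return actual == predicted;
        }
        return String.valueOf(actual).equals(String.valueOf(predicted));
    }

    public static void main(String[] args) {
        System.out.println(lenientEquals("1", 1));   // true: "1" vs "1"
        System.out.println(lenientEquals(1L, 1));    // true: long and int render the same
        System.out.println(lenientEquals("1.0", 1)); // false: "1.0" vs "1"
    }
}
```

Note that this leniency is purely textual: `1.0` (double) and `1` (integer) still differ, because their string forms differ.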

@przemekwitek przemekwitek force-pushed the fix_overall_accuracy branch 2 times, most recently from b75cbbe to b05927c on March 12, 2020, 14:32
@przemekwitek przemekwitek changed the title Make accuracy evaluation metric work when there is field mapping type mismatch Make classification evaluation metrics work when there is field mapping type mismatch Mar 12, 2020
@przemekwitek przemekwitek removed the WIP label Mar 12, 2020
@przemekwitek przemekwitek marked this pull request as ready for review March 12, 2020 14:41
@elasticmachine

Pinging @elastic/ml-core (:ml)

@przemekwitek przemekwitek force-pushed the fix_overall_accuracy branch from b05927c to 8e66ea3 on March 16, 2020, 08:47
@benwtrent benwtrent left a comment

Naming suggestion, but looks good.

4 participants