MeanAveragePrecision doesn't work as expected when using max_detection_thresholds != [1, 10, 100]
#2360
🐛 Bug
When using MeanAveragePrecision, the mARs aren't computed as expected when max_detection_thresholds are not the "default" values of [1, 10, 100]. E.g., when using [1, 10, 1000] (with the pycocotools backend) or [1, 10, 100, 1000] (with the faster_coco_eval backend), still only mAR@1, mAR@10, and mAR@100 are computed.

To Reproduce
Use this code snippet to reproduce the bug:
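Below is a minimal sketch of such a reproduction, not the exact snippet from the report: it uses made-up single-box predictions and targets and assumes a torchmetrics 1.x install with both the pycocotools and faster_coco_eval backends available.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Dummy single-box prediction and matching target (made-up values for illustration).
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 60.0, 80.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
target = [{
    "boxes": torch.tensor([[10.0, 20.0, 60.0, 80.0]]),
    "labels": torch.tensor([0]),
}]

# pycocotools only accepts exactly three thresholds; faster_coco_eval accepts more.
for backend, thresholds in (
    ("pycocotools", [1, 10, 1000]),
    ("faster_coco_eval", [1, 10, 100, 1000]),
):
    metric = MeanAveragePrecision(
        backend=backend, max_detection_thresholds=thresholds
    )
    metric.update(preds, target)
    print(backend, metric.compute())
```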
Running the code above shows that mAR@1000 has not been computed, irrespective of the backend used.
Expected behavior
I expect to be able to compute mARs for the given max_detection_thresholds, no matter what these thresholds actually are.

Environment
Additional context
This was working fine with the pycocotools backend in torchmetrics 0.11.4. It even worked for max_detection_thresholds = [1, 10, 100, 10000], which apparently isn't possible anymore, because only len(max_detection_thresholds) == 3 is allowed when using the pycocotools backend.