Commit: Fix mAP calculation for areas with 0 predictions (#1080)
* Fix mAP calculation for areas with 0 predictions

The issue was first discussed here: #1061 (comment)

* chlog

Co-authored-by: Jirka <[email protected]>
Co-authored-by: Daniel Stancl <[email protected]>

(cherry picked from commit 84274ab)
23pointsNorth authored and Borda committed Jun 14, 2022
1 parent 24957b1 commit 659216b
Showing 3 changed files with 65 additions and 5 deletions.
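
The failure mode being fixed: when an image (or COCO area bucket) has ground truths but zero predictions, the naive precision TP / (TP + FP) is 0 / 0. As a hedged, hypothetical pure-Python sketch (not the torchmetrics implementation), the COCO-style convention the fix restores looks roughly like this:

```python
def ap_for_empty_predictions(num_gt: int) -> float:
    """COCO-style handling of a bucket with zero predictions.

    With no predictions there are no true/false positives, so the naive
    precision TP / (TP + FP) is 0 / 0. COCO-style evaluators special-case
    the bucket instead: if ground truths exist, AP is 0.0 (everything was
    missed); if there are no ground truths either, the bucket gets the
    sentinel -1 and is excluded when averaging into mAP.
    """
    if num_gt == 0:
        return -1.0  # nothing to detect: skip this bucket in the mean
    return 0.0  # GT present but nothing predicted: zero recall, zero AP


# mAP then averages only the buckets that actually contain ground truths.
scores = [ap_for_empty_predictions(n) for n in (0, 3, 1)]
valid = [s for s in scores if s != -1.0]
mean_ap = sum(valid) / len(valid)
```

The function name and the list-based aggregation are illustrative only; the commit below applies the same idea inside the library's `__evaluate_image_gt_no_preds` path.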
43 changes: 41 additions & 2 deletions CHANGELOG.md
@@ -7,17 +7,56 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
**Note: we move fast, but still we preserve 0.1 version (one feature release) back compatibility.**


## [UnReleased] - 2022-MM-DD

### Added

-


-


### Changed

-

-


### Deprecated

-

-


### Removed

-

-


### Fixed

- Fixed mAP calculation for areas with 0 predictions ([#1080](https://github.com/PyTorchLightning/metrics/pull/1080))


-


## [0.9.1] - 2022-06-08

### Added

- - Added specific `RuntimeError` when metric object is on wrong device ([#1056](https://github.com/PyTorchLightning/metrics/pull/1056))
+ - Added specific `RuntimeError` when metric object is on the wrong device ([#1056](https://github.com/PyTorchLightning/metrics/pull/1056))
- Added an option to specify own n-gram weights for `BLEUScore` and `SacreBLEUScore` instead of using uniform weights only. ([#1075](https://github.com/PyTorchLightning/metrics/pull/1075))

### Fixed

- Fixed aggregation metrics when input only contains zero ([#1070](https://github.com/PyTorchLightning/metrics/pull/1070))
- - Fixed `TypeError` when providing superclass arguments as kwargs ([#1069](https://github.com/PyTorchLightning/metrics/pull/1069))
+ - Fixed `TypeError` when providing superclass arguments as `kwargs` ([#1069](https://github.com/PyTorchLightning/metrics/pull/1069))
- Fixed bug related to state reference in metric collection when using compute groups ([#1076](https://github.com/PyTorchLightning/metrics/pull/1076))


23 changes: 22 additions & 1 deletion tests/detection/test_map.py
@@ -462,7 +462,7 @@ def test_missing_gt():


@pytest.mark.skipif(_pytest_condition, reason="test requires that torchvision=>0.8.0 is installed")
-def test_segm_iou_empty_mask():
+def test_segm_iou_empty_gt_mask():
"""Test empty ground truths."""
metric = MeanAveragePrecision(iou_type="segm")

@@ -482,6 +482,27 @@ def test_segm_iou_empty_mask():
metric.compute()


@pytest.mark.skipif(_pytest_condition, reason="test requires that torchvision=>0.8.0 is installed")
def test_segm_iou_empty_pred_mask():
"""Test empty predictions."""
metric = MeanAveragePrecision(iou_type="segm")

metric.update(
[
dict(
masks=torch.BoolTensor([]),
scores=Tensor([]),
labels=IntTensor([]),
),
],
[
dict(masks=torch.randint(0, 1, (1, 10, 10)).bool(), labels=IntTensor([4])),
],
)

metric.compute()


@pytest.mark.skipif(_pytest_condition, reason="test requires that torchvision=>0.8.0 is installed")
def test_error_on_wrong_input():
"""Test class input validation."""
4 changes: 2 additions & 2 deletions torchmetrics/detection/mean_ap.py
@@ -482,9 +482,9 @@ def __evaluate_image_gt_no_preds(
) -> Dict[str, Any]:
"""Some GT but no predictions."""
# GTs
-gt = gt[gt_label_mask]
+gt = [gt[i] for i in gt_label_mask]
nb_gt = len(gt)
-areas = box_area(gt)
+areas = compute_area(gt, iou_type=self.iou_type).to(self.device)
ignore_area = (areas < area_range[0]) | (areas > area_range[1])
gt_ignore, _ = torch.sort(ignore_area.to(torch.uint8))
gt_ignore = gt_ignore.to(torch.bool)
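
The key change in this hunk swaps `box_area` (valid only for box inputs) for a `compute_area` helper that dispatches on `iou_type`, so the no-predictions path also works for segmentation masks. A hedged pure-Python sketch of that dispatch and of the `ignore_area` range check (the names and list-based types here are illustrative, not the library's actual tensor code):

```python
def compute_area_sketch(targets, iou_type="bbox"):
    """Return one area per ground truth, dispatching on iou_type.

    bbox: each target is an [x1, y1, x2, y2] box -> (x2 - x1) * (y2 - y1).
    segm: each target is a binary mask (list of rows) -> count of 1 cells.
    A bare box_area call would crash or mis-measure in the segm case.
    """
    if iou_type == "bbox":
        return [(x2 - x1) * (y2 - y1) for x1, y1, x2, y2 in targets]
    if iou_type == "segm":
        return [sum(sum(row) for row in mask) for mask in targets]
    raise ValueError(f"unknown iou_type: {iou_type}")


def ignore_area_sketch(areas, area_range):
    """Mirror of the `ignore_area` line in the diff: a ground truth is
    ignored when its area falls outside [area_range[0], area_range[1]]."""
    lo, hi = area_range
    return [a < lo or a > hi for a in areas]
```

In the patched method the ignored flags are then sorted and used as the `gt_ignore` vector for the image, exactly as in the boxed branch.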
