
I have a question about how to interpret the mAP metric. For example, if an object detector gets a mAP of 48, is that a good result? And if you have an AP of 50, is it meaningfully different?

Going back to IoU: if the metric is mAP50, does that mean a detection counts as correct with an IoU of only 0.50 (roughly half overlap)? In that case the detector would not be localizing everything tightly, so could it give unreliable results?

Any help is welcome.

Sebastián

1 Answer


I suggest you read these posts: Intuition behind Average Precision and MAP [1] and Breaking Down Mean Average Precision (mAP); I found them very helpful. As for the difference between an AP of 48 and an AP of 50: the 50 AP is simply a little better. Here's a short excerpt from [1]:

"For example, suppose we are searching for images of a flower and we provide our image retrieval system a sample picture of a rose (query), we do get back a bunch of ranked images (from most likely to least likely). Usually not all of them are correct. So we compute the precision at every correctly returned image, and then take an average. If our returned result is

1, 0, 0, 1, 1, 1

then the precision at every correct point is: how many correct images have been encountered up to this point (including current) divided by the total images seen up to this point.

1/1, 0, 0, 2/4, 3/5, 4/6

where 1 is an image of a flower, while 0 is not. The AP for the above example is 0.6917.
A simple way to interpret this is to produce a combination of zeros and ones which will give the required AP: an AP of 0.5 could have results like

0, 1, 0, 1, 0, 1, ...

where every second image is correct, while an AP of 0.333 has

0, 0, 1, 0, 0, 1, 0, 0, 1, ...

where every third image is correct..."
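
To make the quoted computation concrete, here is a minimal Python sketch (my own illustration, not code from either linked post) that computes AP from a ranked list of binary relevance labels, exactly as described above:

```python
def average_precision(ranked_results):
    """Compute AP for a ranked list of binary relevance labels (1 = correct, 0 = not)."""
    hits = 0
    precisions = []
    for i, rel in enumerate(ranked_results, start=1):
        if rel == 1:
            hits += 1
            # Precision at this correct position:
            # correct results so far / total results seen so far
            precisions.append(hits / i)
    return sum(precisions) / len(precisions) if precisions else 0.0

print(average_precision([1, 0, 0, 1, 1, 1]))  # ~0.6917, matching the quoted example
```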

MrNobody33
  • Thanks for the explanation and references! It's clear to me, but I'm left with one question: if my metric is mAP75, does that mean it only considers detections good when they have an IoU of 0.75? – Sebastián Jun 11 '20 at 06:32
  • @Sebastián if you use mAP@75, it means that during the calculation of TP and FP for the precision-recall curve you take IoU = 0.75. This is a stricter requirement for bounding-box intersection than IoU = 0.5. – dinarkino Dec 01 '21 at 10:37
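
To illustrate the point in the last comment, here is a minimal sketch (illustrative names, not from any detection library) of the IoU check that decides whether a single detection counts as a TP or an FP under a given threshold:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

detection = (10, 10, 50, 50)      # hypothetical predicted box
ground_truth = (15, 15, 55, 55)   # hypothetical ground-truth box
score = iou(detection, ground_truth)  # ~0.62 for these boxes
print(score >= 0.50)  # True  -> this detection is a TP at mAP@50
print(score >= 0.75)  # False -> the same detection is an FP at mAP@75
```

The same detection can thus be a true positive under mAP@50 but a false positive under mAP@75, which is why mAP@75 scores are typically lower.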