Metrics for object detection

Adapted from https://github.com/rafaelpadilla/Object-Detection-Metrics#different-competitions-different-metrics

True Positive, False Positive, False Negative and True Negative

True Positive (TP): A correct detection. Detection with IOU ≥ threshold

False Positive (FP): A wrong detection. Detection with IOU < threshold

False Negative (FN): A ground truth not detected

True Negative (TN): Does not apply. It would represent a correctly rejected misdetection. In the object detection task there are many possible bounding boxes that should not be detected within an image, so TN would be all the possible bounding boxes that were correctly not detected (an effectively unbounded number of boxes per image). That's why it is not used by the metrics.

threshold: depending on the metric, it is usually set to 50%, 75% or 95%.
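To make the IOU test concrete, below is a minimal Python sketch; the corner-coordinate box format (x1, y1, x2, y2) and the function names are illustrative assumptions rather than part of any particular evaluation toolkit:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(detection, ground_truth, threshold=0.5):
    """A detection counts as TP when its IOU with a ground truth meets the threshold."""
    return iou(detection, ground_truth) >= threshold

print(is_true_positive((10, 10, 60, 60), (12, 12, 58, 62)))  # True (IOU ~ 0.85)
```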

Precision

Precision is the ability of a model to identify only the relevant objects. It is the percentage of correct positive predictions and is given by:

$$\text{Precision} = \frac{TP}{TP + FP} = \frac{TP}{\text{all detections}}$$

Recall

Recall is the ability of a model to find all the relevant cases (all ground-truth bounding boxes). It is the percentage of true positives detected among all ground truths and is given by:

$$\text{Recall} = \frac{TP}{TP + FN} = \frac{TP}{\text{all ground truths}}$$
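Both definitions reduce to simple ratios of counts. A minimal Python sketch, using hypothetical counts (7 TP, 17 FP, 8 FN) purely for illustration:

```python
def precision(tp, fp):
    # fraction of all detections that are correct
    return tp / (tp + fp) if (tp + fp) > 0 else 0.0

def recall(tp, fn):
    # tp + fn equals the total number of ground-truth objects
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0

print(precision(7, 17))  # 7 TP out of 24 detections -> ~0.2917
print(recall(7, 8))      # 7 TP out of 15 ground truths -> ~0.4667
```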

Precision x Recall curve

The Precision x Recall curve is a good way to evaluate the performance of an object detector as the confidence threshold is varied: a curve is plotted for each object class.

An object detector of a particular class is considered good if its precision stays high as recall increases; in other words, varying the confidence threshold leaves both precision and recall high.

A poor object detector needs to increase the number of detected objects (increasing False Positives = lower precision) in order to retrieve all ground truth objects (high recall). That's why the Precision x Recall curve usually starts with high precision values, decreasing as recall increases. 

Average Precision

Another way to compare the performance of object detectors is to calculate the area under the curve (AUC) of the Precision x Recall curve. In practice, AP is the precision averaged across all recall values between 0 and 1.

Currently, the interpolation performed by the PASCAL VOC challenge uses all data points, rather than interpolating at only 11 equally spaced points.

11-point interpolation

The 11-point interpolation tries to summarize the shape of the Precision x Recall curve by averaging the precision at a set of eleven equally spaced recall levels [0, 0.1, 0.2, ... , 1]:

$$AP = \frac{1}{11} \sum_{r \in \{0,\, 0.1,\, \ldots,\, 1\}} \rho_{\text{interp}}(r)$$

with

$$\rho_{\text{interp}}(r) = \max_{\tilde{r} \,:\, \tilde{r} \geq r} \rho(\tilde{r})$$

where $\rho(\tilde{r})$ is the measured precision at recall $\tilde{r}$.

Instead of using the precision observed at each point, the AP is obtained by interpolating the precision only at the 11 levels $r$, taking the maximum precision whose recall value is greater than $r$.
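A minimal sketch of this 11-point computation, assuming recalls and precisions hold the accumulated values ordered by decreasing confidence (NumPy is used only for convenience):

```python
import numpy as np

def ap_11_point(recalls, precisions):
    """11-point interpolated AP from accumulated precision/recall values."""
    recalls = np.asarray(recalls, dtype=float)
    precisions = np.asarray(precisions, dtype=float)
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):  # recall levels 0, 0.1, ..., 1
        mask = recalls >= r
        # maximum precision at any recall level >= r; 0 if none exists
        p_interp = precisions[mask].max() if mask.any() else 0.0
        ap += p_interp / 11.0
    return ap
```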

Interpolating all points

Instead of interpolating only at the 11 equally spaced points, you could interpolate through all points in such a way that:

$$AP = \sum_{n} \left(r_{n+1} - r_n\right) \rho_{\text{interp}}(r_{n+1})$$

with

$$\rho_{\text{interp}}(r_{n+1}) = \max_{\tilde{r} \,:\, \tilde{r} \geq r_{n+1}} \rho(\tilde{r})$$

where $\rho(\tilde{r})$ is the measured precision at recall $\tilde{r}$.

In this case, instead of using the precision observed at only a few points, the AP is now obtained by interpolating the precision at each level $r_{n+1}$, taking the maximum precision whose recall value is greater than or equal to $r_{n+1}$. This way we calculate the estimated area under the curve.
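A sketch of the all-point interpolation under the same assumptions as before; it follows the usual recipe of padding the curve, making precision monotonically non-increasing from right to left, and summing rectangle areas wherever recall changes:

```python
import numpy as np

def ap_all_points(recalls, precisions):
    """All-point interpolated AP: area under the interpolated PR curve."""
    r = np.concatenate(([0.0], np.asarray(recalls, dtype=float), [1.0]))
    p = np.concatenate(([0.0], np.asarray(precisions, dtype=float), [0.0]))
    # interpolate: p[i] becomes the max precision at any recall >= r[i]
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # sum the rectangles where recall changes
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))
```

Padding the curve with recall 1 and precision 0 means the region beyond the last measured recall contributes nothing to the area.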

An illustrated example

Consider the detections below:

[Figure: seven example images with the 15 ground-truth boxes drawn in green and the 24 detections drawn in red]

There are 7 images with 15 ground-truth objects represented by the green bounding boxes and 24 detected objects represented by the red bounding boxes. Each detected object has a confidence level and is identified by a letter (A, B, ..., Y).

The following table shows the bounding boxes with their corresponding confidences. The last column identifies each detection as TP or FP. In this example a detection is considered a TP if IOU ≥ 30%; otherwise it is an FP. By looking at the images above we can roughly tell whether the detections are TP or FP.

[Table: the 24 detections with their images, confidence levels, and TP/FP labels]

In some images there is more than one detection overlapping a ground truth (Images 2, 3, 4, 5, 6 and 7). For those cases the detection with the highest IOU is considered a TP and the others are considered FPs.
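One common way to implement this matching is a greedy pass over the detections in order of decreasing confidence, letting each ground truth be claimed at most once; the sketch below (reusing the iou() helper from earlier) is one such variant, not the only possible one:

```python
def match_detections(detections, ground_truths, iou_threshold=0.3):
    """detections: list of (confidence, box). Returns (confidence, is_tp) pairs."""
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    claimed = [False] * len(ground_truths)
    results = []
    for conf, box in detections:
        # find the unclaimed ground truth with the highest IOU for this detection
        best_iou, best_gt = 0.0, -1
        for i, gt in enumerate(ground_truths):
            if not claimed[i]:
                overlap = iou(box, gt)  # iou() as sketched earlier
                if overlap > best_iou:
                    best_iou, best_gt = overlap, i
        if best_iou >= iou_threshold:
            claimed[best_gt] = True
            results.append((conf, True))   # TP
        else:
            results.append((conf, False))  # FP: low IOU or ground truth already claimed
    return results
```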

The Precision x Recall curve is plotted by calculating the precision and recall values of the accumulated TP or FP detections. For this, first we need to order the detections by their confidences, then we calculate the precision and recall for each accumulated detection as shown in the table below:

[Table: detections ordered by confidence, with accumulated TP and FP counts and the corresponding precision and recall values]
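The accumulation step described above can be sketched as follows, assuming matches holds (confidence, is_tp) pairs such as those produced by match_detections(); the resulting arrays can be fed directly into ap_11_point() or ap_all_points() from the previous sections:

```python
import numpy as np

def precision_recall_points(matches, num_ground_truths):
    """Accumulated precision/recall values, one point per detection."""
    matches = sorted(matches, key=lambda m: m[0], reverse=True)  # by confidence
    flags = np.array([1.0 if is_tp else 0.0 for _, is_tp in matches])
    acc_tp = np.cumsum(flags)        # running count of true positives
    acc_fp = np.cumsum(1.0 - flags)  # running count of false positives
    precisions = acc_tp / (acc_tp + acc_fp)
    recalls = acc_tp / num_ground_truths
    return recalls, precisions
```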

Plotting the precision and recall values we have the following Precision x Recall curve:

[Figure: Precision x Recall curve for the example detections]

Calculating the 11-point interpolation

[Figure: Precision x Recall curve with the 11 interpolated precision points highlighted]

By applying the 11-point interpolation, we have:

$$AP = \frac{1}{11} \sum_{r \in \{0,\, 0.1,\, \ldots,\, 1\}} \rho_{\text{interp}}(r)$$

$$AP = \frac{1}{11} (1 + 0.6666 + 0.4285 + 0.4285 + 0.4285 + 0 + 0 + 0 + 0 + 0 + 0)$$

$$AP = 26.84\%$$

Calculating the interpolation performed in all points

[Figure: Precision x Recall curve interpolated at all points]

Looking at the plot above, we can divide the AUC into 4 areas (A1, A2, A3 and A4):

[Figure: the interpolated curve divided into areas A1, A2, A3 and A4]

Calculating the total area, we have the AP:

$$AP = A1 + A2 + A3 + A4$$

with:

$$A1 = (0.0666 - 0) \times 1 = 0.0666$$

$$A2 = (0.1333 - 0.0666) \times 0.6666 = 0.04446$$

$$A3 = (0.4 - 0.1333) \times 0.4285 = 0.11428$$

$$A4 = (0.4666 - 0.4) \times 0.3043 = 0.02026$$

$$AP = 0.0666 + 0.04446 + 0.11428 + 0.02026 = 0.2456$$

$$AP = 24.56\%$$