objectDetectionMetrics

Object detection quality metrics

Since R2023b

    Description

    Use the objectDetectionMetrics object and its object functions to evaluate the quality of object detection results.

    An objectDetectionMetrics object stores object detection quality metrics, such as the average precision (AP) and precision and recall values, computed per class and per image. To compute the AP or the precision and recall metrics, pass the objectDetectionMetrics object to the averagePrecision or precisionRecall object function, respectively. To compute the confusion matrix, pass the objectDetectionMetrics object to the confusionMatrix object function. To summarize all metrics across all classes and images in the data set, use the summarize object function.
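    For example, a minimal sketch of this workflow, using hypothetical detectionResults and groundTruthData inputs (see the example below for a complete, runnable version):

    metrics = evaluateObjectDetection(detectionResults,groundTruthData);

    % Query the metrics through the object functions.
    ap = averagePrecision(metrics);                        % AP per class
    [recall,precision,scores] = precisionRecall(metrics);  % precision and recall data
    confMat = confusionMatrix(metrics);                    % confusion matrix
    [summaryDataset,summaryClass] = summarize(metrics);    % data set and class summaries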

    Creation

    Create an objectDetectionMetrics object by using the evaluateObjectDetection function.

    Properties

    ClassMetrics

    This property is read-only.

    Metrics per class, stored as a table with C rows, where C is the number of object classes in the detection results. If you do not specify additional metrics through the AdditionalMetrics argument of the evaluateObjectDetection function, the ClassMetrics table has five columns, corresponding to these object detection metrics.

    • NumObjects — Number of objects in the ground truth data for a class.

    • AP — Average precision (AP) for each class at each overlap threshold in the OverlapThreshold property, stored as a numThresh-by-1 array, where numThresh is the number of overlap thresholds.

    • APOverlapAvg — AP averaged over all overlap thresholds. Specify the overlap thresholds using the threshold argument of the evaluateObjectDetection function.

    • Precision — Precision values, stored as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted bounding boxes. Precision is the ratio of the number of true positives (TP) to the total number of predicted positives.

      Precision = TP / (TP + FP)

      FP is the number of false positives. Larger precision scores imply that most detected objects match ground truth objects.

    • Recall — Recall values, stored as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted bounding boxes. Recall is the ratio of the number of true positives (TP) to the total number of ground truth objects, which is the sum of true positives (TP) and false negatives (FN).

      Recall = TP / (TP + FN)

      FN is the number of false negatives. Larger recall scores indicate that more of the ground truth objects are detected.

      Note

      For each overlap threshold (row in the Recall matrix), the recall values (columns of the Recall matrix) are sorted in the order of decreasing confidence score associated with each detection.

    For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
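    For example, a minimal sketch of reading values out of this table, assuming metrics is an objectDetectionMetrics object returned by evaluateObjectDetection:

    metrics.ClassMetrics                         % display the full per-class table
    apAvg = metrics.ClassMetrics.APOverlapAvg;   % threshold-averaged AP, one value per class
    numObj = metrics.ClassMetrics.NumObjects;    % ground truth object count per class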

    ImageMetrics

    This property is read-only.

    Metrics per image in the data set, stored as a table with numImages rows, where numImages is the number of images in the data set. If you do not specify additional metrics through the AdditionalMetrics argument of the evaluateObjectDetection function, the ImageMetrics table has three columns, corresponding to these object detection metrics.

    • NumObjects — Number of objects in the ground truth data in each image, stored as a positive integer.

    • mAP — Mean average precision (mAP), calculated by averaging the average precision (AP) across all classes at each overlap threshold in the OverlapThreshold property, stored as a numThresh-by-1 numeric vector, where numThresh is the number of overlap thresholds. Specify the overlap thresholds using the threshold argument of the evaluateObjectDetection function.

    • mAPOverlapAvg — Mean average precision (mAP), calculated by averaging the AP across all classes and all overlap thresholds specified by the OverlapThreshold property, stored as a numeric scalar.

    For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
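    As a sketch of how you might use this table, assuming metrics is an objectDetectionMetrics object, you can rank images by their threshold-averaged mAP to find the images on which the detector performs worst:

    % Sort images by threshold-averaged mAP, worst first.
    imgMetrics = metrics.ImageMetrics;
    [~,order] = sort(imgMetrics.mAPOverlapAvg);
    worstImages = order(1:5)   % five lowest-scoring images (assumes at least five images)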

    ClassNames

    Class names of the detected objects, stored as a string array or a cell array of character vectors.

    Example: {"sky"} {"grass"} {"building"} {"sidewalk"}

    OverlapThreshold

    Overlap threshold, stored as a numeric scalar or numeric vector of box overlap threshold values over which the mean average precision is computed. When the intersection over union (IoU) of the pixels in the ground truth bounding box and the predicted bounding box is equal to or greater than the overlap threshold, the detection is considered a match to the ground truth (true positive). The IoU is the number of pixels in the intersection of the bounding boxes divided by the number of pixels in the union of the bounding boxes.
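    To illustrate the matching rule, this sketch computes the IoU of two hypothetical boxes using bboxOverlapRatio, a Computer Vision Toolbox function that returns the intersection-over-union ratio by default:

    % Two axis-aligned boxes in [x y width height] format.
    gtBox   = [100 100 50 50];             % ground truth box
    predBox = [105 105 50 50];             % predicted box, offset by 5 pixels
    iou = bboxOverlapRatio(gtBox,predBox)  % about 0.68 for these boxes
    isTruePositive = iou >= 0.5            % counts as a match at a 0.5 threshold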

    Object Functions

    averagePrecision — Evaluate average precision metric of object detection results
    confusionMatrix — Compute confusion matrix of object detection results
    precisionRecall — Get precision recall metrics of object detection results
    summarize — Summarize object detection performance metrics at data set and class level
    metricsByArea — Evaluate detection performance across object size ranges

    Examples

    Load Data

    Load a table containing images and ground truth bounding box labels. The first column contains the images, and the remaining columns contain the labeled bounding boxes.

    data = load("vehicleTrainingData.mat");
    trainingData = data.vehicleTrainingData;

    Set the dataDir variable to the folder that contains the data, and prepend that folder to the image file names in the table so that each file name is a full path.

    dataDir = fullfile(toolboxdir("vision"),"visiondata");
    trainingData.imageFilename = fullfile(dataDir,trainingData.imageFilename);

    Create an imageDatastore using the files from the table.

    imds = imageDatastore(trainingData.imageFilename);

    Create a boxLabelDatastore using the label columns from the table.

    blds = boxLabelDatastore(trainingData(:,2:end));

    Load Pretrained Object Detector

    Load a pretrained YOLO v2 object detector trained to detect vehicles into the workspace.

    vehicleDetector = load("yolov2VehicleDetector.mat");
    detector = vehicleDetector.detector;

    Evaluate and Plot Object Detection Metrics

    Run the detector on the test images. Set the detection threshold to a low value to detect as many objects as possible. This helps you evaluate the detector precision across the full range of recall values.

    results = detect(detector,imds,Threshold=0.01);

    Use evaluateObjectDetection to compute metrics for evaluating the performance of an object detector.

    metrics = evaluateObjectDetection(results,blds);
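    This call uses the default overlap threshold of 0.5. As a sketch, you can instead pass a vector of overlap thresholds as the third input to evaluateObjectDetection to compute metrics at several IoU levels:

    % Evaluate at multiple overlap thresholds (COCO-style).
    metricsMulti = evaluateObjectDetection(results,blds,0.5:0.05:0.95);
    metricsMulti.OverlapThreshold   % thresholds used for matching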

    Return the precision and recall metrics for the vehicle class using the precisionRecall object function, and compute the average precision (AP) using the averagePrecision object function.

    [recall,precision,scores] = precisionRecall(metrics);
    ap = averagePrecision(metrics);

    Plot the precision-recall curve for the vehicle class, the only class in the data set.

    figure
    plot(recall{1},precision{1})
    grid on
    title("Average Precision = " + ap);
    xlabel("Recall");
    ylabel("Precision");

    Figure: Precision-recall curve for the vehicle class, with an average precision of 0.99096.

    Compute the summary of the object detection metrics for the data set using the summarize object function.

    [summaryDataset,summaryClass] = summarize(metrics);
    summaryDataset
    summaryDataset=1×3 table
        NumObjects    mAPOverlapAvg    mAP0.5 
        __________    _____________    _______
    
           336           0.99096       0.99096
    
    
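    You can also compute the confusion matrix of the detection results using the confusionMatrix object function. A minimal sketch, assuming the single-output form of the call:

    % Confusion matrix of predicted classes against ground truth classes.
    confMat = confusionMatrix(metrics);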

    Version History

    Introduced in R2023b
