
confusionMatrix

Compute confusion matrix of object detection results

Since R2024b

Description

[confMat,confusionClassNames] = confusionMatrix(metrics) computes the confusion matrix of the object detection results in metrics at every overlap threshold in its OverlapThreshold property, and returns the confusion matrices confMat and the corresponding class names confusionClassNames. The function considers all detections when computing the confusion matrix.

[confMat,confusionClassNames] = confusionMatrix(metrics,Name=Value) specifies options using one or more name-value arguments. For example, ScoreThreshold=0.3 specifies to disregard detections with a confidence score below 0.3 when computing the confusion matrix.
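A minimal sketch of a typical workflow, assuming detectionResults and groundTruthData are hypothetical placeholders for your detector output and labeled ground truth boxes, such as the inputs you would pass to evaluateObjectDetection in Computer Vision Toolbox:

% Evaluate a detector, then compute its confusion matrix.
metrics = evaluateObjectDetection(detectionResults,groundTruthData);

% Confusion matrix at every threshold in metrics.OverlapThreshold.
[confMat,confusionClassNames] = confusionMatrix(metrics);

% Ignore detections with confidence scores below 0.3.
[confMatFiltered,~] = confusionMatrix(metrics,ScoreThreshold=0.3);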

Input Arguments


metrics

Object detection performance metrics, specified as an objectDetectionMetrics object.

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: confusionMatrix(metrics,ScoreThreshold=0.3) specifies that the function disregard detections with a confidence score below 0.3 when computing the confusion matrix.

ScoreThreshold

Confidence score threshold values to use for computing the confusion matrix, specified as a numeric scalar or numeric vector with values in the range [0, 1]. The function filters out predictions with confidence scores less than the threshold value when computing the confusion matrix.

Increase this value to reduce the number of false positives, at the possible expense of missing some true positives.
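For example, this sketch (assuming metrics is an existing objectDetectionMetrics object) sweeps several score thresholds in one call, so confMat has one row per threshold:

% Sweep several confidence score thresholds at once.
scoreThresholds = [0.25 0.5 0.75];
[confMat,confusionClassNames] = confusionMatrix(metrics,ScoreThreshold=scoreThresholds);

% confMat has one row per score threshold and one column per overlap
% threshold in metrics.OverlapThreshold.
size(confMat)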

OverlapThreshold

Overlap threshold values to use for computing the confusion matrix, specified as a numeric scalar or numeric vector. Each overlap threshold value must be an element of the OverlapThreshold property of the objectDetectionMetrics object metrics. By default, the function returns the confusion matrix computed at all the overlap thresholds specified by the OverlapThreshold property.
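For example, this sketch restricts the computation to a single overlap (IoU) threshold, assuming 0.5 is one of the values stored in metrics.OverlapThreshold:

% Compute the confusion matrix only at an overlap threshold of 0.5.
% The value must already exist in metrics.OverlapThreshold.
[confMat05,confusionClassNames] = confusionMatrix(metrics,OverlapThreshold=0.5);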

Normalize

Confusion matrix normalization option, specified as a numeric or logical 0 (false) or 1 (true). If you specify Normalize as true, the function normalizes the elements of the confusion matrix in the confMat output by the number of ground truth bounding boxes known to belong to each class. For each overlap threshold, each element (i, j) of the normalized confusion matrix is the count of ground truth bounding boxes that belong to class i but are predicted to belong to class j, divided by the total number of ground truth bounding boxes that belong to class i.
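As a sketch of the effect, assuming metrics is an existing objectDetectionMetrics object, you can compare the raw counts with the normalized values:

% Compare raw counts with normalized values.
[rawConfMat,classNames] = confusionMatrix(metrics);
[normConfMat,~]         = confusionMatrix(metrics,Normalize=true);

% Inspect the matrices for the first score threshold and first overlap threshold.
rawConfMat{1,1}
normConfMat{1,1}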

Output Arguments


confMat

Confusion matrix, returned as an M-by-N cell array. M is the number of confidence score thresholds specified by the ScoreThreshold argument, and N is the number of overlap thresholds specified by the OverlapThreshold argument.

Each element of the confMat cell array is a square numeric matrix of size (C+1)-by-(C+1), where C is the number of classes. Each element (i, j) of the matrix is the number of matched bounding boxes that are predicted to belong to class j but have a ground truth class of i.

Row and column C+1 in the confusion matrix correspond to these unmatched conditions:

  • Column C+1 contains undetected objects (false negatives). Each element in this column is the number of ground truth annotations of the corresponding class that are not matched to any predicted bounding box.

  • Row C+1 contains incorrect predictions (false positives). Each element in this row is the number of predicted bounding boxes of the corresponding class that are not matched to any ground truth annotation.

This image shows the structure of a sample confusion matrix, displayed as a confusion chart. Each element of the matrix contains the number of predictions that fall into a matched or unmatched category. The sum of values in each class row is the total number of ground truth bounding boxes that belong to the corresponding class. The sum of values in each class column is the total number of predicted bounding boxes that belong to the corresponding class.

[Figure: sample confusion matrix displayed as a confusion chart]
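To work with the returned matrices, index into the cell array. This sketch, assuming confMat and confusionClassNames come from a previous call, extracts the matrix for the first score threshold and first overlap threshold, then reads the per-class false negatives and false positives from column and row C+1:

% Matrix for the first score threshold (row) and first overlap threshold (column).
cm = confMat{1,1};
C  = numel(confusionClassNames) - 1;   % number of object classes

% Column C+1 holds undetected objects (false negatives) per ground truth class;
% row C+1 holds unmatched predictions (false positives) per predicted class.
falseNegativesPerClass = cm(1:C,C+1);
falsePositivesPerClass = cm(C+1,1:C);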


confusionClassNames

Class names corresponding to the returned confusion matrices, returned as a string array of length C+1, where C is the number of classes. The first C elements correspond to the classes stored in the ClassNames property of the objectDetectionMetrics object metrics, and the last element is an extra background class. The background class, "background", corresponds to row and column C+1 of the confusion matrix, which count the false positives and false negatives, respectively.
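The class names pair with the matrices for display. For example, assuming confMat and confusionClassNames come from a previous call, you can pass one matrix and the names to confusionchart:

% Display the confusion matrix for the first score and overlap thresholds as a
% confusion chart, labeling the rows and columns with the class names.
confusionchart(confMat{1,1},confusionClassNames);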

Version History

Introduced in R2024b