objectDetectionMetrics

Description

Use the objectDetectionMetrics object and its object functions to evaluate the quality of object detection results. An objectDetectionMetrics object stores object detection quality metrics, such as the average precision (AP) and precision recall, computed per class and per image. To compute the AP and precision recall metrics, pass the objectDetectionMetrics object to the averagePrecision or precisionRecall object functions, respectively. To compute the confusion matrix, pass the objectDetectionMetrics object to the confusionMatrix object function. To evaluate the summary of all metrics across all classes and all images in the data set, use the summarize object function.
Creation

Create an objectDetectionMetrics object by using the evaluateObjectDetection function.
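For example, this minimal sketch creates a metrics object from detection results. It assumes you already have a table of detections, results, returned by the detect function, and a boxLabelDatastore of ground truth labels, blds, as in the example later on this page.

% Assumes "results" holds detections and "blds" holds ground truth labels
metrics = evaluateObjectDetection(results,blds);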
Properties

ClassMetrics — Metrics per class
table

This property is read-only.

Metrics per class, stored as a table with C rows, where C is the number of classes in the object detection. If you do not specify additional metrics through the AdditionalMetrics argument of the evaluateObjectDetection function, the ClassMetrics table has five columns, corresponding to these object detection metrics:
- NumObjects — Number of objects in the ground truth data for a class.
- AP — Average precision (AP) for each class at each overlap threshold in OverlapThreshold, stored as a numThresh-by-1 array, where numThresh is the number of overlap thresholds.
- APOverlapAvg — AP averaged over all overlap thresholds. Specify the overlap thresholds for a class using the threshold argument.
- Precision — Precision values, stored as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted bounding boxes. Precision is the ratio of the number of true positives (TP) to the total number of predicted positives: Precision = TP / (TP + FP), where FP is the number of false positives. Larger precision scores imply that most detected objects match ground truth objects.
- Recall — Recall values, stored as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted bounding boxes. Recall is the ratio of the number of true positives (TP) to the total number of ground truth objects, which is the sum of true positives and false negatives (FN): Recall = TP / (TP + FN). Larger recall scores indicate that more of the ground truth objects are detected.
Note: For each overlap threshold (row in the Recall matrix), the recall values (columns of the Recall matrix) are sorted in order of decreasing confidence score associated with each detection.
For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
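The precision and recall formulas above can be checked with a few lines of MATLAB. The counts below are hypothetical, chosen only to illustrate the arithmetic:

TP = 8; FP = 2; FN = 4;          % hypothetical detection counts
precision = TP/(TP + FP)         % 0.80: most detections match ground truth
recall = TP/(TP + FN)            % 0.67: two thirds of the objects are detected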
ImageMetrics — Metrics per image
table

This property is read-only.

Metrics per image in the data set, stored as a table with numImages rows, where numImages is the number of images in the data set. If you do not specify additional metrics through the AdditionalMetrics argument of the evaluateObjectDetection function, the ImageMetrics table has three columns, corresponding to these object detection metrics:
- NumObjects — Number of objects in the ground truth data in each image, stored as a positive integer.
- mAP — Mean average precision (mAP), calculated by averaging the average precision (AP) across all classes at each overlap threshold in the OverlapThreshold property, stored as a numThresh-by-1 numeric vector, where numThresh is the number of overlap thresholds. Specify the overlap thresholds for an image using the threshold argument.
- mAPOverlapAvg — Mean average precision (mAP), calculated by averaging the AP across all classes and all overlap thresholds specified by the OverlapThreshold property, stored as a numeric scalar.
For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
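Because ImageMetrics is an ordinary MATLAB table, you can index into it directly, for example, to find the image on which the detector performs worst. This is a sketch, assuming a metrics object named metrics already exists:

imgMetrics = metrics.ImageMetrics;              % one row per image
[~,worstImage] = min(imgMetrics.mAPOverlapAvg)  % index of the hardest image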
ClassNames — Class names
string array | cell array of character vectors

Class names of detected objects, stored as a string array or a cell array of character vectors.

Example: {"sky"} {"grass"} {"building"} {"sidewalk"}
OverlapThreshold — Overlap threshold
numeric scalar | numeric vector
Overlap threshold, stored as a numeric scalar or numeric vector of box overlap threshold values over which the mean average precision is computed. When the intersection over union (IoU) of the pixels in the ground truth bounding box and the predicted bounding box is equal to or greater than the overlap threshold, the detection is considered a match to the ground truth (true positive). The IoU is the number of pixels in the intersection of the bounding boxes divided by the number of pixels in the union of the bounding boxes.
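For intuition, the IoU compared against this threshold can be computed directly with the bboxOverlapRatio function, which defaults to the intersection-over-union ratio. The boxes below are made up, in [x y width height] format:

gtBox   = [100 100 50 50];              % ground truth box
predBox = [105 105 50 50];              % predicted box, shifted by 5 pixels
iou = bboxOverlapRatio(gtBox,predBox)   % about 0.68
isMatch = iou >= 0.5                    % true positive at a 0.5 threshold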
Object Functions

| Function | Description |
|---|---|
| averagePrecision | Evaluate average precision metric of object detection results |
| confusionMatrix | Compute confusion matrix of object detection results |
| precisionRecall | Get precision recall metrics of object detection results |
| summarize | Summarize object detection performance metrics at data set and class level |
| metricsByArea | Evaluate detection performance across object size ranges |
Examples
Plot Precision-Recall Curve for Object Detection
Load a table containing images and ground truth bounding box labels. The first column contains the images, and the remaining columns contain the labeled bounding boxes.
data = load("vehicleTrainingData.mat");
trainingData = data.vehicleTrainingData;
Set the dataDir variable to the folder that contains the vehicleTrainingData.mat file, and prepend it to the image file names so that they point to the local vehicle data folder.

dataDir = fullfile(toolboxdir("vision"),"visiondata");
trainingData.imageFilename = fullfile(dataDir,trainingData.imageFilename);
Create an imageDatastore using the files from the table.
imds = imageDatastore(trainingData.imageFilename);
Create a boxLabelDatastore using the label columns from the table.
blds = boxLabelDatastore(trainingData(:,2:end));
Load Pretrained Object Detector
Load a pretrained YOLO v2 object detector trained to detect vehicles into the workspace.
vehicleDetector = load("yolov2VehicleDetector.mat");
detector = vehicleDetector.detector;
Evaluate and Plot Object Detection Metrics
Run the detector on the test images. Set the detection threshold to a low value to detect as many objects as possible. This helps you evaluate the detector precision across the full range of recall values.
results = detect(detector,imds,Threshold=0.01);
Use evaluateObjectDetection to compute metrics for evaluating the performance of the object detector.
metrics = evaluateObjectDetection(results,blds);
Return the precision and recall metrics for the vehicle class using the precisionRecall object function, and compute the average precision (AP) using the averagePrecision object function.

[recall,precision,scores] = precisionRecall(metrics);
ap = averagePrecision(metrics);

Plot the precision-recall curve for the vehicle class, the only class in the data set.

figure
plot(recall{1},precision{1})
grid on
title("Average Precision = " + ap);
xlabel("Recall");
ylabel("Precision");
Compute the summary of the object detection metrics for the data set using the summarize object function.

[summaryDataset,summaryClass] = summarize(metrics);
summaryDataset
summaryDataset=1×3 table
NumObjects mAPOverlapAvg mAP0.5
__________ _____________ _______
336 0.99096 0.99096
Version History

Introduced in R2023b

R2024b: ConfusionMatrix, NormalizedConfusionMatrix, and DatasetMetrics properties have been removed
The ConfusionMatrix, NormalizedConfusionMatrix, and DatasetMetrics properties of the objectDetectionMetrics object have been removed.
To update your code to compute the confusion matrix, replace instances of the ConfusionMatrix and NormalizedConfusionMatrix properties with the confusionMatrix object function. For an example, see the "Evaluate Detector Errors Using Confusion Matrix" section of the Multiclass Object Detection Using YOLO v2 Deep Learning example.
To compute the summary of the object detection metrics over the entire data set or over each class, use the summarize object function.
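A minimal migration sketch, assuming metrics is an objectDetectionMetrics object; the exact outputs of confusionMatrix are not shown here:

% Before R2024b (no longer works):
% confMat = metrics.ConfusionMatrix;

% R2024b and later: call the object functions instead
confMat = confusionMatrix(metrics);
[summaryDataset,summaryClass] = summarize(metrics);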
To compute precision, recall, and confidence scores for all classes in the data set, or at specified classes and overlap thresholds, use the precisionRecall object function.
To compute average precision (AP) for all classes and overlap thresholds in the data set, or to specify the classes and overlap thresholds for which to compute AP, use the averagePrecision object function.
R2024b: Table columns in ClassMetrics and ImageMetrics properties have been renamed

These table columns of the ClassMetrics and ImageMetrics properties have been renamed.