Precision-recall plot given the ground truth, predicted labels, and predicted scores

1 view (last 30 days)
How can I get the precision-recall plot for this? I know of the function at http://www.mathworks.com/help/stats/perfcurve.html and the File Exchange entry at http://www.mathworks.com/matlabcentral/fileexchange/21528-precision-recall-and-roc-curves, but the issue is that their inputs are the true class labels and the predicted scores.
For example (I have edited my question; this is my actual example, in which every detection is predicted as the positive class):
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 1 ]
predicted_scores = [ 10 9 8 7 6 5 ] (scores for the corresponding labels)
If I set the threshold at 6, then I get 3 false positives and 2 true positives:
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 0 ]
If I set the threshold at 8, then I get 2 false positives and 1 true positive:
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 0 0 0 ]
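The two worked thresholds above can be checked mechanically. Here is a rough sketch of the counting in plain Python (not MATLAB; `pr_at_threshold` is a made-up helper name, not a library function), assuming a detection is predicted positive whenever its score is at or above the threshold:

```python
# Sketch of the threshold arithmetic above (pr_at_threshold is a
# made-up helper name, not a library function).
def pr_at_threshold(true_labels, scores, threshold):
    """Predict positive when score >= threshold, then count TP/FP/FN
    and compute precision and recall."""
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(1 for t, p in zip(true_labels, preds) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(true_labels, preds) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(true_labels, preds) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) > 0 else 1.0
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
    return tp, fp, precision, recall

true_labels = [0, 1, 0, 0, 1, 1]
scores = [10, 9, 8, 7, 6, 5]
print(pr_at_threshold(true_labels, scores, 6))  # tp=2, fp=3, precision=0.4, recall=2/3
print(pr_at_threshold(true_labels, scores, 8))  # tp=1, fp=2, precision=1/3, recall=1/3
```

This reproduces both cases in the question: threshold 6 gives 2 true positives and 3 false positives; threshold 8 gives 1 true positive and 2 false positives.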
3 comments

RuiQi 2016-7-8 (edited 2016-7-8)
Yes, the scores are a measure of certainty, and they happen to be arranged in descending order.
And aren't precision and recall plots based on the scores? A higher threshold leads to fewer false positives but also fewer true positives, so the precision-recall plot indirectly shows the performance of the detector over a range of thresholds.
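Concretely, sweeping the threshold over each distinct score yields one (recall, precision) point per threshold, and plotting those points gives the PR curve. A rough Python sketch of that sweep (`build_pr_curve` is a name I made up, not a library call):

```python
# Sketch: build PR-curve points by sweeping a threshold over each
# distinct score (build_pr_curve is a made-up name, not a library call).
def build_pr_curve(true_labels, scores):
    points = []  # (recall, precision) pairs, one per threshold
    total_pos = sum(true_labels)
    for thr in sorted(set(scores), reverse=True):
        # Predict positive when score >= thr.
        preds = [1 if s >= thr else 0 for s in scores]
        tp = sum(1 for t, p in zip(true_labels, preds) if t == 1 and p == 1)
        fp = sum(1 for t, p in zip(true_labels, preds) if t == 0 and p == 1)
        points.append((tp / total_pos, tp / (tp + fp)))
    return points

points = build_pr_curve([0, 1, 0, 0, 1, 1], [10, 9, 8, 7, 6, 5])
for recall, precision in points:
    print(f"recall={recall:.3f}  precision={precision:.3f}")
```

In MATLAB, `perfcurve` should produce the same curve directly from the true labels and scores, e.g. `[recall, precision] = perfcurve(true_labels, predicted_scores, 1, 'XCrit', 'reca', 'YCrit', 'prec')`, if I am reading the `'XCrit'`/`'YCrit'` criterion names in its documentation correctly.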


Answers (0)
