Classification learner - Confusion matrix
I am trying to interpret the results of the confusion matrix (Classification Learner Toolbox) but cannot find the true negative (TN) and false positive (FP) values. I am wondering if FP is related to the 'False Discovery Rates', and whether the 'Positive Predictive Values' could be the TN values?
Answers (1)
Athul Prakash
2020-4-3
Hi EK,
You are correct about False Discovery Rates being the same as False Positives. However, I'm not sure which metric you mean when you say True Negatives - for a multiclass problem, do you mean that every example that is not labelled as class 'A' and is not predicted as class 'A' would count as a TN for class A (and similarly for every class in your dataset)?
You can calculate this metric by subtracting all the TPs, FPs, and FNs from the total number of examples, which leaves you with the TNs - since TP + FP + TN + FN = total number of examples. A sketch of this is shown below.
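For illustration, here is a minimal MATLAB sketch of that bookkeeping for a multiclass confusion matrix. The matrix C and its values are made up for the example; it follows the confusionmat convention of rows = true class, columns = predicted class.

% Per-class TP/FP/FN/TN from a confusion matrix C, where C(i,j) counts
% examples of true class i predicted as class j (confusionmat layout).
% The matrix below is example data only.
C = [50 2 3; 4 45 1; 2 3 40];

total = sum(C(:));
TP = diag(C);            % correct predictions per class
FP = sum(C, 1)' - TP;    % predicted as this class, but actually another
FN = sum(C, 2) - TP;     % actually this class, but predicted as another
TN = total - TP - FP - FN;   % everything else, per the identity above

disp(table(TP, FP, FN, TN))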
Alternatively, you can export the model trained in the app to your workspace and then get predictions on your test data. After that, use 'confusionmat' to obtain the confusion matrix as a MATLAB array, from which you can calculate each of these four metrics manually.
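A short sketch of that workflow, assuming the model was exported from Classification Learner as 'trainedModel' and that the test predictors and labels are in Xtest and Ytest (those variable names are placeholders, not something the app creates for you):

% Predict with the exported model, then build the confusion matrix.
Ypred = trainedModel.predictFcn(Xtest);    % exported struct exposes predictFcn
[C, order] = confusionmat(Ytest, Ypred);   % rows: true class, columns: predicted
confusionchart(C, order)                   % optional visual check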
Hope it helps!