fitPosterior
Fit posterior probabilities for compact support vector machine (SVM) classifier
Syntax

ScoreSVMModel = fitPosterior(SVMModel,TBL,Y)
ScoreSVMModel = fitPosterior(SVMModel,X,Y)
[ScoreSVMModel,ScoreTransform] = fitPosterior(___)
Description
ScoreSVMModel = fitPosterior(SVMModel,TBL,Y) returns a trained support vector machine (SVM) classifier ScoreSVMModel containing the optimal score-to-posterior-probability transformation function for two-class learning. For more details, see Algorithms. If you train SVMModel using a table, then you must use a table as input for fitPosterior.
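For illustration, the following is a minimal sketch of the table syntax; the fisheriris sample data set, the two-class reduction, and the variable names are illustrative choices, not part of this syntax description.

load fisheriris
idx = ~strcmp(species,'setosa');                       % keep two classes only
Tbl = array2table(meas(idx,:),'VariableNames',{'SL','SW','PL','PW'});  % predictor table
Y = species(idx);                                      % class labels
CompactSVMModel = compact(fitcsvm(Tbl,Y));             % compact SVM classifier
ScoreSVMModel = fitPosterior(CompactSVMModel,Tbl,Y);   % fit the posterior transformation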
ScoreSVMModel = fitPosterior(SVMModel,X,Y) returns a trained SVM classifier ScoreSVMModel containing the optimal score-to-posterior-probability transformation function for two-class learning. If you train SVMModel using a matrix, then you must use a matrix as input for fitPosterior.
[ScoreSVMModel,ScoreTransform] = fitPosterior(___) additionally returns the optimal score-to-posterior-probability transformation function parameters (ScoreTransform) for any of the input argument combinations in the previous syntaxes.
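The following is a hedged sketch of the matrix and two-output syntaxes; the ionosphere sample data set and the variable names are illustrative.

load ionosphere                                        % predictor matrix X, labels Y ('b' and 'g')
rng(1)                                                 % for reproducibility of the cross-validation
CompactSVMModel = compact(fitcsvm(X,Y));               % compact SVM classifier
[ScoreSVMModel,ScoreTransform] = fitPosterior(CompactSVMModel,X,Y);
ScoreTransform                                         % parameters of the fitted transformation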
Tips
This process describes one way to predict positive class posterior probabilities.

1. Train an SVM classifier by passing the data to fitcsvm. The result is a trained SVM classifier, such as SVMModel, that stores the data. The software sets the score transformation function property (SVMModel.ScoreTransform) to 'none'.

2. Pass the trained SVM classifier SVMModel to fitSVMPosterior or fitPosterior. The result, such as ScoreSVMModel, is the same trained SVM classifier as SVMModel, except that the software sets ScoreSVMModel.ScoreTransform to the optimal score transformation function.

3. Pass the predictor data matrix and the trained SVM classifier containing the optimal score transformation function (ScoreSVMModel) to predict. The second column in the second output argument of predict stores the positive class posterior probabilities corresponding to each row of the predictor data matrix.

If you skip step 2, then predict returns the positive class score rather than the positive class posterior probability. A sketch of these steps appears after this list.
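The following minimal sketch walks through these steps using the ionosphere sample data set and this page's compact-classifier syntax; the data set and variable names are illustrative.

load ionosphere                                        % predictor matrix X, labels Y
SVMModel = fitcsvm(X,Y);                               % step 1: ScoreTransform is 'none'
CompactSVMModel = compact(SVMModel);                   % compact classifier for this page's syntax
ScoreSVMModel = fitPosterior(CompactSVMModel,X,Y);     % step 2: fit the transformation
[labels,postProbs] = predict(ScoreSVMModel,X);         % step 3: predict
postProbs(1:5,2)                                       % positive class posterior probabilities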
After fitting posterior probabilities, you can generate C/C++ code that predicts labels for new data. Generating C/C++ code requires MATLAB® Coder™. For details, see Introduction to Code Generation.
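The following is a hedged sketch of that code-generation workflow, continuing from the sketch above; the file name SVMPosteriorMdl and the entry-point function predictPosterior are hypothetical.

saveLearnerForCoder(ScoreSVMModel,'SVMPosteriorMdl');  % save the fitted classifier to disk

% Contents of a hypothetical entry-point file, predictPosterior.m:
%   function [labels,postProbs] = predictPosterior(Xnew) %#codegen
%   Mdl = loadLearnerForCoder('SVMPosteriorMdl');
%   [labels,postProbs] = predict(Mdl,Xnew);
%   end
%
% Generate code, for example: codegen predictPosterior -args {X}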
Algorithms
The software fits the appropriate score-to-posterior-probability transformation
function by using the SVM classifier SVMModel
and by conducting
10-fold cross-validation using the stored predictor data (SVMModel.X
)
and the class labels (SVMModel.Y
), as outlined in [1]. The transformation function computes the posterior probability that an observation
is classified into the positive class (SVMModel.ClassNames(2)
).
If the classes are inseparable, then the transformation function is the sigmoid function.
If the classes are perfectly separable, then the transformation function is the step function.
In two-class learning, if one of the two classes has a relative frequency of 0, then the transformation function is the constant function. The fitPosterior function is not appropriate for one-class learning.

The software stores the optimal score-to-posterior-probability transformation function in ScoreSVMModel.ScoreTransform. The sketch after this paragraph illustrates the perfectly separable case.
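As a hedged illustration of these cases, the fisheriris classes 'setosa' and 'versicolor' are linearly separable, so the fitted transformation is expected to be a step function, and the software typically warns about perfect separation; the class choice is illustrative.

load fisheriris
idx = ~strcmp(species,'virginica');                    % keep two linearly separable classes
CompactSVMModel = compact(fitcsvm(meas(idx,:),species(idx)));
[~,ST] = fitPosterior(CompactSVMModel,meas(idx,:),species(idx));
ST                                                     % expected: a step-function transformation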
If you re-estimate the score-to-posterior-probability transformation function, that is, if you pass an SVM classifier to fitPosterior or fitSVMPosterior and its ScoreTransform property is not 'none', then the software:

Displays a warning

Resets the original transformation function to 'none' before estimating the new one

A sketch of this behavior appears after this list.
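A minimal sketch of this behavior follows; the ionosphere sample data set is illustrative.

load ionosphere
CompactSVMModel = compact(fitcsvm(X,Y));
ScoreSVMModel = fitPosterior(CompactSVMModel,X,Y);     % ScoreTransform is now set
ScoreSVMModel2 = fitPosterior(ScoreSVMModel,X,Y);      % expected to warn and refit from 'none'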
Alternative Functionality
You can also fit the optimal score-to-posterior-probability function by using fitSVMPosterior. This function is similar to fitPosterior, except that it accepts a wider range of SVM classifier types.
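The following is a hedged sketch of this alternative; unlike the syntaxes on this page, fitSVMPosterior can take a full classifier without additional data arguments because the full classifier stores the training data.

load ionosphere
SVMModel = fitcsvm(X,Y);                               % full classifier (stores the data)
ScoreSVMModel = fitSVMPosterior(SVMModel);             % no predictor or response arguments needed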
References
[1] Platt, J. “Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods.” Advances in Large Margin Classifiers. Cambridge, MA: The MIT Press, 2000, pp. 61–74.
Version History
Introduced in R2014a
See Also
CompactClassificationSVM
| fitcsvm
| fitSVMPosterior
| predict