resume
Resume training support vector machine (SVM) classifier
Syntax

UpdatedSVMModel = resume(SVMModel,numIter)
UpdatedSVMModel = resume(SVMModel,numIter,Name,Value)

Description

UpdatedSVMModel = resume(SVMModel,numIter) returns an updated support vector machine (SVM) classifier UpdatedSVMModel by training the SVM classifier SVMModel for numIter more iterations. Like SVMModel, the updated SVM classifier is a ClassificationSVM classifier. resume continues applying the training options set when SVMModel was trained with fitcsvm.

UpdatedSVMModel = resume(SVMModel,numIter,Name,Value) returns UpdatedSVMModel with additional options specified by one or more name-value pair arguments. For example, you can specify the verbosity level.
Examples
Resume Training SVM Classifier
Train an SVM classifier and intentionally cause the solver to fail to converge onto a solution. Then resume training the classifier without having to restart the entire learning process.
Load the ionosphere
data set.
load ionosphere
rng(1); % For reproducibility
Train an SVM classifier. Specify that the optimization routine uses at most 50 iterations.
SVMModel = fitcsvm(X,Y,'IterationLimit',50);
DidConverge = SVMModel.ConvergenceInfo.Converged
DidConverge = logical
0
Reason = SVMModel.ConvergenceInfo.ReasonForConvergence
Reason = 'NoConvergence'
DidConverge = 0 indicates that the optimization routine did not converge onto a solution. Reason states why the routine did not converge. Therefore, SVMModel is a partially trained SVM classifier.
Resume training the SVM classifier for another 1500 iterations.

UpdatedSVMModel = resume(SVMModel,1500);
DidConverge = UpdatedSVMModel.ConvergenceInfo.Converged
DidConverge = logical
1
Reason = UpdatedSVMModel.ConvergenceInfo.ReasonForConvergence
Reason = 'DeltaGradient'
DidConverge indicates that the optimization routine converged onto a solution. Reason indicates that the gradient difference (DeltaGradient) reached its tolerance level (DeltaGradientTolerance). Therefore, UpdatedSVMModel is a fully trained SVM classifier.
Monitor Training of SVM Classifier
Train an SVM classifier and intentionally cause the solver to fail to converge onto a solution. Then resume training the classifier without having to restart the entire learning process. Compare values of the resubstitution loss for the partially trained classifier and the fully trained classifier.
Load the ionosphere data set.
load ionosphere
Train an SVM classifier. Specify that the optimization routine uses at most 100 iterations. Monitor the algorithm by specifying that the software print diagnostic information every 50 iterations.
SVMModel = fitcsvm(X,Y,'IterationLimit',100,'Verbose',1,'NumPrint',50);
|===================================================================================================================================|
|  Iteration  |  Set  |   Set Size  |  Feasibility  |     Delta     |      KKT      |  Number of  |   Objective   |   Constraint  |
|             |       |             |      Gap      |    Gradient   |   Violation   |  Supp. Vec. |               |   Violation   |
|===================================================================================================================================|
|           0 |active |         351 |  9.971591e-01 |  2.000000e+00 |  1.000000e+00 |           0 |  0.000000e+00 |  0.000000e+00 |
|          50 |active |         351 |  8.064425e-01 |  3.736929e+00 |  2.161317e+00 |          60 | -3.628863e+01 |  1.110223e-16 |

SVM optimization did not converge to the required tolerance.
The software prints an iterative display to the Command Window. The printout indicates that the optimization routine has not converged onto a solution.
Estimate the resubstitution loss of the partially trained SVM classifier.
partialLoss = resubLoss(SVMModel)
partialLoss = 0.1054
The training sample misclassification error is approximately 11%.
Resume training the classifier for another 1500 iterations. Specify that the software print diagnostic information every 250 iterations.
UpdatedSVMModel = resume(SVMModel,1500,'NumPrint',250)
|===================================================================================================================================|
|  Iteration  |  Set  |   Set Size  |  Feasibility  |     Delta     |      KKT      |  Number of  |   Objective   |   Constraint  |
|             |       |             |      Gap      |    Gradient   |   Violation   |  Supp. Vec. |               |   Violation   |
|===================================================================================================================================|
|         250 |active |         351 |  1.137406e-01 |  1.688486e+00 |  1.064098e+00 |         100 | -7.654307e+01 |  1.477984e-15 |
|         500 |active |         351 |  2.458986e-03 |  8.900780e-02 |  5.353919e-02 |         103 | -7.819650e+01 |  1.570792e-15 |
|         750 |active |         351 |  6.861149e-04 |  2.041818e-02 |  1.045385e-02 |         103 | -7.820930e+01 |  1.499668e-15 |
|        1000 |active |         351 |  5.992844e-05 |  1.878806e-03 |  1.095583e-03 |         103 | -7.820958e+01 |  1.606354e-15 |
|        1072 |active |         351 |  3.992245e-05 |  9.877142e-04 |  5.324559e-04 |         103 | -7.820959e+01 |  1.823194e-15 |

Exiting Active Set upon convergence due to DeltaGradient.
UpdatedSVMModel = 
  ClassificationSVM
             ResponseName: 'Y'
    CategoricalPredictors: []
               ClassNames: {'b'  'g'}
           ScoreTransform: 'none'
          NumObservations: 351
                    Alpha: [103x1 double]
                     Bias: -3.8829
         KernelParameters: [1x1 struct]
           BoxConstraints: [351x1 double]
          ConvergenceInfo: [1x1 struct]
          IsSupportVector: [351x1 logical]
                   Solver: 'SMO'
The software resumes training where it previously stopped and uses the same verbosity level as the one set when you trained the model using fitcsvm, printing diagnostic information every 250 iterations as specified. The printout indicates that the algorithm converged. Therefore, UpdatedSVMModel is a fully trained ClassificationSVM classifier.
updatedLoss = resubLoss(UpdatedSVMModel)
updatedLoss = 0.0769
The training sample misclassification error of the fully trained classifier is approximately 8%.
Input Arguments
SVMModel — Full, trained SVM classifier
ClassificationSVM classifier

Full, trained SVM classifier, specified as a ClassificationSVM model trained with fitcsvm.
numIter — Number of iterations
positive integer
Number of iterations to continue training the SVM classifier, specified as a positive integer.
Data Types: double
Name-Value Arguments
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: resume(SVMModel,500,'Verbose',2) trains SVMModel for 500 more iterations and specifies displaying diagnostic messages and saving convergence criteria at every iteration.
Verbose — Verbosity level
0 | 1 | 2

Verbosity level, specified as the comma-separated pair consisting of 'Verbose' and 0, 1, or 2. Verbose controls the amount of optimization information displayed in the Command Window and saved as a structure to SVMModel.ConvergenceInfo.History.
This table summarizes the verbosity level values.
Value | Description
---|---
0 | The software does not display or save convergence information.
1 | The software displays diagnostic messages and saves convergence criteria every numprint iterations, where numprint is the value of the 'NumPrint' name-value pair argument.
2 | The software displays diagnostic messages and saves convergence criteria at every iteration.
By default, Verbose is the value that fitcsvm uses to train SVMModel.
Example: 'Verbose',1
Data Types: single
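As a quick illustration of overriding the verbosity level when resuming, the following sketch (using the ionosphere data set from the examples above; the iteration counts are arbitrary choices) resumes with 'Verbose',2 so that convergence criteria are displayed and saved at every iteration, and then retrieves the saved history. The exact fields of the History structure depend on the solver.

```matlab
% Sketch: override Verbose when resuming, then inspect saved criteria.
load ionosphere
SVMModel = fitcsvm(X,Y,'IterationLimit',50);        % partially trained
UpdatedSVMModel = resume(SVMModel,100,'Verbose',2); % save criteria every iteration
history = UpdatedSVMModel.ConvergenceInfo.History   % structure of saved convergence criteria
```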
NumPrint — Number of iterations between diagnostic message printouts
nonnegative integer

Number of iterations between diagnostic message printouts, specified as the comma-separated pair consisting of 'NumPrint' and a nonnegative integer.

If you set 'Verbose',1 and 'NumPrint',numprint, then the software displays all optimization diagnostic messages from SMO [1] and ISDA [2] every numprint iterations in the Command Window.

By default, NumPrint is the value that fitcsvm uses to train SVMModel.
Example: 'NumPrint',500
Data Types: single
Tips
If optimization does not converge and the solver is 'SMO' or 'ISDA', then try to resume training the SVM classifier.
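This tip can be sketched as a loop that keeps resuming training in fixed-size batches of iterations until the solver reports convergence. This is a minimal illustration, not a recommended workflow; the batch size of 1000 iterations and the ionosphere data set are arbitrary choices, and in practice you may want to cap the total number of resume calls.

```matlab
% Sketch: resume in batches of iterations until the solver converges.
load ionosphere
Mdl = fitcsvm(X,Y,'IterationLimit',50);  % likely stops before converging
while ~Mdl.ConvergenceInfo.Converged
    Mdl = resume(Mdl,1000);              % continues with the same training options
end
Mdl.ConvergenceInfo.ReasonForConvergence
```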
References
[1] Fan, R.-E., P.-H. Chen, and C.-J. Lin. “Working set selection using second order information for training support vector machines.” Journal of Machine Learning Research, Vol. 6, 2005, pp. 1889–1918.
[2] Kecman, V., T.-M. Huang, and M. Vogt. “Iterative Single Data Algorithm for Training Kernel Machines from Huge Data Sets: Theory and Performance.” In Support Vector Machines: Theory and Applications, edited by Lipo Wang, 255–274. Berlin: Springer-Verlag, 2005.
Extended Capabilities
GPU Arrays
Accelerate code by running on a graphics processing unit (GPU) using Parallel Computing Toolbox™.
This function fully supports GPU arrays. For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).
Version History
Introduced in R2014a
See Also