SVMTRAIN - LEAST SQUARE METHOD OR QUADRATIC PROGRAMMING METHOD
Hi,
I am using the svmtrain function. The QP method is really slow (far too slow) and sometimes it crashes. When I use the LS (least squares) method it is SUPER fast.
Could you explain the difference between these two methods? In theory, should they give the same result? And why is one so much faster than the other?
Thank you for your help.
Accepted Answer
Ilya
2013-2-25
LS minimizes the mean squared error to solve the SVM regression problem. In SVM classification, the alpha coefficients are bounded between 0 and C (the box constraint), so the support vectors (observations with positive alphas) obtained by solving the classification SVM problem are only a fraction of the training set. When minimizing MSE, the alpha coefficients are not bounded, and usually every observation in the training set ends up with a non-zero alpha coefficient.
QP solves the classification problem but is slow on large datasets. For large-scale classification problems, your best bet is SMO.
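As a sketch of why LS is so fast and why it loses sparsity, the following toy example (in Python/NumPy rather than MATLAB, using made-up data and assuming the standard LS-SVM classifier formulation, which replaces the QP's inequality constraints with equality constraints) solves the whole problem with one linear system, after which essentially every training point carries a non-zero alpha:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-class toy set (hypothetical data, for illustration only)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
n = len(y)

C = 1.0
gamma = 0.5

# RBF kernel matrix
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)

# LS-SVM dual: a single (n+1)x(n+1) linear system, no box constraints
Omega = (y[:, None] * y[None, :]) * K
A = np.zeros((n + 1, n + 1))
A[0, 1:] = y
A[1:, 0] = y
A[1:, 1:] = Omega + np.eye(n) / C
rhs = np.hstack([0.0, np.ones(n)])
sol = np.linalg.solve(A, rhs)   # one solve -- this is why LS is so fast
b, alpha = sol[0], sol[1:]

# Unlike the box-constrained QP, the alphas are unbounded and dense:
print(np.sum(np.abs(alpha) > 1e-8), "of", n, "alphas are nonzero")
```

The speed comes from the absence of inequality constraints: a dense linear solve replaces an iterative QP, but the price is that no training point is dropped, so prediction touches the entire training set.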
More Answers (1)
Shashank Prasanna
2013-2-25
QP solves the quadratic optimization problem using the Hessian. In short: the advantage is higher precision, and the disadvantage is that it doesn't scale to large problems (large training sets).
LS just solves a linear system with the \ operator for a least-squares fit. There may be precision differences between the two, but both are good choices; which one works best depends on the nature of the problem.
Take a look at the following page in the documentation:
Also feel free to take a look at the code by:
>> edit svmtrain
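To make the QP side concrete, here is a minimal sketch (in Python/NumPy rather than MATLAB, on made-up toy data) of the box-constrained SVM dual that a QP solver works on. For simplicity it drops the bias term so the equality constraint disappears, and uses plain projected gradient ascent instead of a real QP/SMO solver; the point is only that the Hessian-based problem yields a sparse set of support vectors:

```python
import numpy as np

rng = np.random.default_rng(1)

# Well-separated toy data (hypothetical, for illustration only)
X = np.vstack([rng.normal(-2, 0.6, (25, 2)), rng.normal(2, 0.6, (25, 2))])
y = np.hstack([-np.ones(25), np.ones(25)])
n = len(y)

C = 1.0
Q = (y[:, None] * y[None, :]) * (X @ X.T)   # linear kernel: Hessian of the dual

# Projected gradient ascent on: max 1'a - 0.5 a'Qa  s.t.  0 <= a <= C
alpha = np.zeros(n)
eta = 1.0 / (np.linalg.norm(Q, 2) + 1e-12)  # step from the Hessian's top eigenvalue
for _ in range(2000):
    alpha = np.clip(alpha + eta * (1.0 - Q @ alpha), 0.0, C)

# The box constraint pins most alphas to exactly 0 -> sparse solution
support = np.abs(alpha) > 1e-6
print(support.sum(), "of", n, "training points are support vectors")
```

Contrast this with the LS path, which is a single `\`-style linear solve: the QP iterates over a constrained problem (hence slower and harder to scale), but only the few margin points survive as support vectors.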