soft-margin SVM optimization

Ece Sureyya Birol 2020-4-21
Hello
I am trying to implement the cost function of the unconstrained form of the binary soft-margin SVM optimization problem, which is given by g(θ) = f0(θ) + Σj fj(θ), where f0(θ) = 0.5*||w||^2 and fj(θ) = C*max(0, 1 − yj*θ'*xj), j = 1, . . . , n. The subgradients are ∇θ f0(θ) = (w; 0) (the bias b is not regularized) and ∇θ fj(θ) = −C*yj*xj if yj*θ'*xj < 1 and 0 otherwise.
I cannot figure out how to implement fj(θ) = C*max(0, 1 − yj*θ'*xj), j = 1, . . . , n, because I don't know how to compute the maximum. Is there a built-in function I can use to find the max?
yj is the label vector y of size 105 by 1.
xj are the rows of a 105 by 3 feature matrix containing the training data.
θ is a 3 by 1 vector, θ = (w b)', holding the parameters of the soft-margin binary SVM classifier.
C is just a scalar value.
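
For reference, MATLAB's built-in max answers the question about the maximum directly: called as max(0, v), it returns the element-wise maximum of 0 and every entry of v. Below is a minimal sketch of fj and a subgradient of g under the sizes given above; it assumes (hypothetically) that X stacks the xj' as rows, that the first two entries of θ hold w, and that the last entry holds the bias b.

margins = y .* (X * theta);                          % yj * theta' * xj for every j at once (105-by-1)
fj      = C * max(0, 1 - margins);                   % hinge terms via built-in max against the scalar 0
g       = 0.5 * norm(theta(1:2))^2 + sum(fj);        % f0(theta) + sum_j fj(theta); bias not penalized
active  = margins < 1;                               % samples with yj * theta' * xj < 1
subgrad = [theta(1:2); 0] - C * X' * (y .* active);  % subgradient of g: w-part from f0 plus hinge part
% A subgradient-descent step would then be, e.g., theta = theta - stepSize * subgrad;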
If you have any tips or tricks to share, I would really appreciate it.
Thank you,
AJ

Answers (1)

Hiro Yoshino 2020-4-21
Are you using MATLAB or other open-source software?
Either way, I bet there is a package for your purpose, i.e., you do not need to implement it yourself. If you really want to, you should consult a proper book; the algorithm is not that complex.
For MATLAB, check this out for a brief explanation:
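
For example, assuming the Statistics and Machine Learning Toolbox is available, fitcsvm trains a linear soft-margin SVM without any hand-written solver. A minimal sketch using the variable names from the question:

Mdl  = fitcsvm(X, y, 'BoxConstraint', C);   % linear kernel by default; C plays the role of the box constraint
pred = predict(Mdl, X);                     % predicted +/-1 labels for the training data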
