Why is SVM not giving the expected result?

1 view (last 30 days)
Diver
Diver 2015-10-21
Commented: Diver 2015-10-23
I have training data composed of only one feature.
  • The feature has around 113K observations.
  • Only 8K of those observations have the positive class.
  • 105K of those observations have the negative class.
  • Of the 8K positive observations, 90% have a value below 1 and 10% a value above 1.
  • Of the 105K negative observations, 80% have a value above 1 and 20% a value below 1.
Hence, almost any X value below 1 should be predicted as the positive class, and almost any X value above 1 should be predicted as the negative class.
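To illustrate what I mean (the values below are made up to match the percentages above, not my actual data), a plain threshold at 1 already separates most of the observations:
% Synthetic data shaped like the description above, just for illustration.
rng(0);
nPos = 8000;  nNeg = 105000;                       % rough class sizes
xPos = [1 - 0.05*rand(round(0.9*nPos),1); ...      % 90% of positives strictly below 1
        1 + 0.03*rand(round(0.1*nPos),1)];         % 10% strictly above 1
xNeg = [1 + 0.03*rand(round(0.8*nNeg),1); ...      % 80% of negatives strictly above 1
        1 - 0.05*rand(round(0.2*nNeg),1)];         % 20% strictly below 1
X = [xPos; xNeg];
Y = [ones(numel(xPos),1); -ones(numel(xNeg),1)];   % +1 = positive, -1 = negative

labelThresh = -ones(size(X));                      % simple rule: below 1 -> positive
labelThresh(X < 1) = 1;
accPos = mean(labelThresh(Y ==  1) ==  1)          % 0.9 by construction
accNeg = mean(labelThresh(Y == -1) == -1)          % 0.8 by construction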
I used the following fitcsvm call:
svmStruct = fitcsvm(X,Y,'Standardize',true, 'Prior','uniform','KernelFunction','linear','KernelScale','auto','Verbose',1,'IterationLimit',1000000);
fitcsvm gives a message at the end saying "SVM optimization did not converge to the required tolerance." But why? Most of the positive-class X values are below 1 and vice versa, so it should be easy to find a classification boundary. And when I run:
[label,score,cost]= predict(svmStruct, X) ;
it gives wrong predictions.
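For reference, this is how I inspect what the fitted model learned (assuming the standard ClassificationSVM properties; the boundary arithmetic is my own back-of-the-envelope for a standardized linear kernel, not something I have verified):
% Check convergence and where the learned boundary sits in the original units of X,
% assuming svmStruct came from the fitcsvm call above ('Standardize',true, linear kernel).
svmStruct.ConvergenceInfo.Converged        % did the solver converge?
beta  = svmStruct.Beta;                    % linear coefficient (standardized space)
bias  = svmStruct.Bias;
mu    = svmStruct.Mu;                      % mean used by 'Standardize'
sigma = svmStruct.Sigma;                   % std  used by 'Standardize'
s     = svmStruct.KernelParameters.Scale;  % 'auto' KernelScale resolved to a number

% For a linear kernel the score is beta*((x - mu)/sigma)/s + bias, so the
% decision boundary (score = 0) in the original units of X is approximately:
boundary = mu - bias*sigma*s/beta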
A portion of my X values is listed below:
0.9911
0.9836
0.9341
0.9751
0.9880
0.9977
0.9853
0.9861
1.0143
1.0086
0.9594
0.9787
0.9927
0.9839
1.0024
0.9931
0.9930
1.0275
  4 Comments
Image Analyst
Image Analyst 2015-10-21
Can you help us by sending your classes to gscatter() and showing us a screenshot?
Diver
Diver 2015-10-23
My understanding is that gscatter() requires X and Y (2 features) in addition to the group. In my case, however, I only have 1 feature plus the group, so I don't think I can use gscatter() to plot a single feature.
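The closest I can think of (not sure this is what you intended) is to fake a second axis with the observation index:
% With a single feature, put the observation index on the horizontal axis
% (or zeros(size(X)) to collapse everything onto one line) and let gscatter()
% color the points by class.
gscatter((1:numel(X))', X, Y)
xlabel('observation index'); ylabel('X value');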


Answers (0)
