Accuracy is very low! When I used a small part of the data set, the accuracy reached 15%, and when I used the whole data set, the accuracy was zero
function [result] = multisvm(TrainingSet,Group_Train1,TestSet,Group_Test1)
%Models a given training set with a corresponding group vector and
%classifies a given test set using an SVM classifier according to a
%one vs. all relation.
%
%This code was written by Cody Neuburger cneuburg@fau.edu
%Florida Atlantic University, Florida USA...
%This code was adapted and cleaned from Anand Mishra's multisvm function
%found at http://www.mathworks.com/matlabcentral/fileexchange/33170-multi-class-support-vector-machine/
u = unique(Group_Train1);
numClasses = length(u);
result = categorical.empty();

% Build one binary SVM model per class (one vs. all)
models = cell(numClasses,1);
for k = 1:numClasses
    % Vectorized statement that binarizes Group,
    % where true is the current class and false is all other classes
    G1vAll = (Group_Train1 == u(k));
    models{k} = fitcsvm(TrainingSet, G1vAll, ...
        'KernelFunction','polynomial', 'PolynomialOrder',3, ...
        'Solver','ISDA', 'Verbose',0, 'Standardize',true);
    if ~models{k}.ConvergenceInfo.Converged
        fprintf('Training did not converge for class "%s"\n', string(u(k)));
    end
end

% Classify test cases: assign the first class whose binary SVM
% predicts the positive label
for t = 1:size(TestSet,1)
    matched = false;
    for k = numClasses:-1:1
        if predict(models{k}, TestSet(t,:))
            matched = true;
            break;
        end
    end
    if matched
        result(t,1) = u(k);
    else
        result(t,1) = 'No Match';
    end
end

Accuracy = mean(Group_Test1 == result) * 100;
fprintf('Accuracy = %.2f\n', Accuracy);
fprintf('Error rate = %.2f\n', mean(result ~= Group_Test1) * 100);
end
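For reference, a common refinement of the one-vs-all decision above is to compare the positive-class scores of all binary models and take the largest, rather than stopping at the first model that predicts true. A minimal sketch, assuming the models, u, numClasses, TestSet, and result variables from the function above, and that column 2 of each model's score output corresponds to the positive (logical true) class:

% Score-based one-vs-all decision (sketch): pick the class whose binary
% SVM returns the largest positive-class score for each test row.
for t = 1:size(TestSet,1)
    scores = zeros(numClasses,1);
    for k = 1:numClasses
        % Second output of predict is the score matrix; with logical
        % training labels, column 2 is the score for the positive class.
        [~, s] = predict(models{k}, TestSet(t,:));
        scores(k) = s(2);
    end
    [~, best] = max(scores);   % class with the highest positive-class score
    result(t,1) = u(best);
end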
When I used a small part of the data set, the accuracy reached 15%, and when I used the whole data set, the accuracy was zero. How can I modify the options or make the model suitable for the data?
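For comparison (a sketch only, not necessarily the fix for this data set), the built-in fitcecoc function in the Statistics and Machine Learning Toolbox trains a multiclass SVM directly; checking whether it also scores near zero on the same split can help separate a coding problem from a data or kernel problem. It assumes the same TrainingSet, Group_Train1, TestSet, and Group_Test1 variables passed to multisvm:

% Built-in multiclass SVM baseline (sketch), reusing the same
% polynomial-kernel settings as the hand-rolled one-vs-all code.
t = templateSVM('KernelFunction','polynomial', 'PolynomialOrder',3, ...
    'Standardize',true);
Mdl = fitcecoc(TrainingSet, Group_Train1, 'Learners',t, 'Coding','onevsall');
pred = predict(Mdl, TestSet);
fprintf('fitcecoc accuracy = %.2f%%\n', mean(pred == Group_Test1) * 100);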