SVM and KNN hyperparameter optimization
I am attempting to optimize KNN and SVM classifiers with an optimization algorithm other than the default Bayesian optimization.
Can anyone help, please?
Answers (1)
Alan Weiss
2022-7-1
Is this what you are looking for?
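The linked documentation covers the automatic hyperparameter optimization built into the fitting functions. As a minimal sketch (assuming a predictor matrix X and a label vector Y; the names are placeholders), both fitcsvm and fitcknn accept an 'OptimizeHyperparameters' argument:
% Bayesian optimization of the standard SVM hyperparameters (BoxConstraint, KernelScale)
svmModel = fitcsvm(X,Y,'OptimizeHyperparameters','auto');
% The same idea for KNN (NumNeighbors, Distance)
knnModel = fitcknn(X,Y,'OptimizeHyperparameters','auto');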
Alan Weiss
MATLAB mathematical toolbox documentation
2 Comments
Alan Weiss
2022-7-4
To use a different algorithm you would have to run a different solver. I am not at all sure of the benefit of using a different algorithm, and I am not familiar with the Bat algorithm. Really, what do you expect to get that is better?
To use ga (genetic algorithm), you need a Global Optimization Toolbox license. To minimize the cross-validation error, you might want to fix the partition, and then use ga to minimize the error over several parameter settings. Something like this:
rng default
n = size(X,1);                % Number of observations in the predictor matrix X
c = cvpartition(n,'KFold',5); % Fix a 5-fold cross-validation partition
% I assume that you want to optimize over x(1) = BoxConstraint and x(2) = KernelScale
lb = [1/10,1/10]; % Somewhat arbitrary lower bounds
ub = [10,10];     % Somewhat arbitrary upper bounds
% Minimize the cross-validation loss for the fixed partition
fun = @(x)kfoldLoss(fitcsvm(X,Y,'CVPartition',c,'BoxConstraint',x(1),'KernelScale',x(2)));
[sol,fval] = ga(fun,2,[],[],[],[],lb,ub);
% Now retrain the model on all the data using the optimal parameters
model = fitcsvm(X,Y,'BoxConstraint',sol(1),'KernelScale',sol(2));
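A similar sketch works for KNN, for example optimizing NumNeighbors over a somewhat arbitrary range of 1 to 30 (assuming the same X, Y, and fixed partition c) using ga's integer-constraint argument:
% Cross-validation loss as a function of the number of neighbors
funknn = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x(1)));
% One integer variable (IntCon = 1), bounded between 1 and 30 neighbors
[solknn,fvalknn] = ga(funknn,1,[],[],[],[],1,30,[],1);
% Retrain on all the data with the selected number of neighbors
knnModel = fitcknn(X,Y,'NumNeighbors',solknn(1));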
But again, before you do this, I believe you should think about what you expect to get that is better than the automatic hyperparameter optimization using Bayesian optimization.
Alan Weiss
MATLAB mathematical toolbox documentation