I'm trying to test the performance of a kNN classifier and an SVM-based ECOC classifier (fitcecoc) on a multiclass problem.
For the kNN model I do it by building mdlknn, a cross-validated model (leave-one-out), and then calling the kfoldLoss function:
mdlknn = fitcknn(X, labels, 'NumNeighbors', k, ...
    'Distance', @distKNN, 'Leaveout', 'on', ...
    'HyperparameterOptimizationOptions', struct('UseParallel', true));
    % note: this option takes a struct, and it only has an effect when
    % 'OptimizeHyperparameters' is also specified
perf = 1 - kfoldLoss(mdlknn);
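For reference, my @distKNN follows the function-handle signature that fitcknn's 'Distance' option expects; the body below is just a Euclidean placeholder, not my real metric:

```matlab
% Signature fitcknn expects for 'Distance', @distKNN:
% ZI is a 1-by-N row (one observation), ZJ is M-by-N, output D2 is M-by-1.
function D2 = distKNN(ZI, ZJ)
    D2 = sqrt(sum((ZJ - ZI).^2, 2));  % placeholder: Euclidean distance
end
```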
Is this correct? My doubt regards the cross-validation:
is using the option ('Leaveout','on') in fitcknn the same as not using it and then calling crossval on the trained model?
mdlknnNoCrossVal = fitcknn(X, labels, 'NumNeighbors', k, ...
    'Distance', @distKNN, ...
    'HyperparameterOptimizationOptions', struct('UseParallel', true));
mdlknn = crossval(mdlknnNoCrossVal, 'Leaveout', 'on');
perf = 1 - kfoldLoss(mdlknn);
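To convince myself the two routes match, I ran a quick sanity check on fisheriris (shipped with the Statistics and Machine Learning Toolbox), with the built-in 'euclidean' standing in for my custom distance:

```matlab
% Leave-one-out at fit time vs. crossval on an already trained model.
load fisheriris                       % meas: 150-by-4 features, species: labels
mdlA = fitcknn(meas, species, 'NumNeighbors', 5, ...
    'Distance', 'euclidean', 'Leaveout', 'on');
mdlB = crossval( ...
    fitcknn(meas, species, 'NumNeighbors', 5, 'Distance', 'euclidean'), ...
    'Leaveout', 'on');
% Both are ClassificationPartitionedModel objects; with leave-one-out the
% folds are identical (one observation per fold), so the losses should agree.
fprintf('A: %.4f  B: %.4f\n', kfoldLoss(mdlA), kfoldLoss(mdlB));
```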
Next, since the kNN classifier uses my own distance function, I would like to do the same with the ECOC model. How can I do that?
At this stage, the code I use to build the ECOC model doesn't use it. How can I set it?
t = templateSVM('Standardize', true, 'KernelFunction', 'rbf');
mdlsvm = fitcecoc(X, labels, 'Leaveout', 'on', ...
    'Coding', 'ternarycomplete', 'Learners', t, ...
    'HyperparameterOptimizationOptions', struct('UseParallel', true));
perf = 1 - kfoldLoss(mdlsvm);
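The closest analogue I found so far is an SVM custom kernel: templateSVM's 'KernelFunction' accepts the name of a function on the MATLAB path (mykernel below is a hypothetical placeholder, not code I actually have):

```matlab
% mykernel.m on the path, with the documented signature G = KFUN(U,V):
% U is M-by-P, V is N-by-P, and G is the M-by-N Gram matrix.
%   function G = mykernel(U, V)
%       G = U * V';            % placeholder: linear kernel
%   end
t = templateSVM('Standardize', true, 'KernelFunction', 'mykernel');
```

Is that the right way to bring a custom similarity into fitcecoc, or is there a direct equivalent of the kNN 'Distance' option?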
And again: is using the option ('Leaveout','on') in fitcecoc the same as not using it and then calling crossval on the trained model?
Thanks a lot!