TreeBagger parameter tuning for classification

How can I tune the parameters of a TreeBagger model for classification? I followed the example "Tune Random Forest Using Quantile Error and Bayesian Optimization" (https://fr.mathworks.com/help/stats/tune-random-forest-using-quantile-error-and-bayesian-optimization.html), only changing 'regression' to 'classification'. The following code generated multiple errors:
results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
'AcquisitionFunctionName','expected-improvement-plus','Verbose',0);
errors:
Error using classreg.learning.internal.table2FitMatrix>resolveName (line 232)
One or more 'ResponseName' parameter values are invalid.
Error in classreg.learning.internal.table2FitMatrix (line 77)
ResponseName = resolveName('ResponseName',ResponseName,FormulaResponseName,false,VarNames);
Error in ClassificationTree.prepareData (line 557)
[X,Y,vrange,wastable,varargin] =
classreg.learning.internal.table2FitMatrix(X,Y,varargin{:},'OrdinalIsCategorical',false);
Error in TreeBagger/init (line 1335)
ClassificationTree.prepareData(x,y,...
Error in TreeBagger (line 615)
bagger = init(bagger,X,Y,makeArgs{:});
Error in oobErrRF2 (line 16)
randomForest = TreeBagger(300,X,'MPG','Method','classification',...
Error in @(params)oobErrRF2(params,trainingDataFeatures)
Error in BayesianOptimization/callObjNormally (line 2184)
Objective = this.ObjectiveFcn(conditionalizeX(this, X));
Error in BayesianOptimization/callObjFcn (line 2145)
= callObjNormally(this, X);
Error in BayesianOptimization/callObjFcn (line 2162)
= callObjFcn(this, X);
Error in BayesianOptimization/performFcnEval (line 2128)
ObjectiveFcnObjectiveEvaluationTime, this] = callObjFcn(this, this.XNext);
Error in BayesianOptimization/run (line 1836)
this = performFcnEval(this);
Error in BayesianOptimization (line 450)
this = run(this);
Error in bayesopt (line 287)
Results = BayesianOptimization(Options);
I would like to know if there is a way to use this method of tuning for classification. If not, how can I tune the parameters of a TreeBagger classifier? Thanks.
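For reference, the TreeBagger call inside my objective function oobErrRF2 is essentially the one from the regression example with 'Method' switched to 'classification'; a condensed sketch of it (my table is trainingDataFeatures, passed in as X):
% Sketch of the failing call, condensed from the stack trace above.
% The response name is still 'MPG', as in the regression example.
randomForest = TreeBagger(300,X,'MPG','Method','classification',...
    'OOBPrediction','on','MinLeafSize',params.minLS,...
    'NumPredictorstoSample',params.numPTS);
oobErr = oobError(randomForest,'Mode','ensemble');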

Answer (1)

Don Mathis 2018-6-8
The following works for me in R2018a. It predicts 'Cylinders' (3 classes) and it calls oobError to get the misclassification rate of the ensemble.
load carsmall
Cylinders = categorical(Cylinders);
Mfg = categorical(cellstr(Mfg));
Model_Year = categorical(Model_Year);
X = table(Acceleration,Cylinders,Displacement,Horsepower,Mfg,...
    Model_Year,Weight,MPG);
rng('default'); % For reproducibility
maxMinLS = 20;
minLS = optimizableVariable('minLS',[1,maxMinLS],'Type','integer');
numPTS = optimizableVariable('numPTS',[1,size(X,2)-1],'Type','integer');
hyperparametersRF = [minLS; numPTS];
results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
    'AcquisitionFunctionName','expected-improvement-plus','Verbose',1);
bestOOBErr = results.MinObjective
bestHyperparameters = results.XAtMinObjective
Mdl = TreeBagger(300,X,'Cylinders','Method','classification',...
    'MinLeafSize',bestHyperparameters.minLS,...
    'NumPredictorsToSample',bestHyperparameters.numPTS);
function oobErr = oobErrRF(params,X)
%oobErrRF Trains a random forest and estimates the out-of-bag classification error
%   oobErr trains a random forest of 300 classification trees using the
%   predictor data in X and the parameter specification in params, and then
%   returns the out-of-bag misclassification rate of the ensemble. X is a
%   table and params is an array of OptimizableVariable objects
%   corresponding to the minimum leaf size and the number of predictors to
%   sample at each node.
randomForest = TreeBagger(300,X,'Cylinders','Method','classification',...
    'OOBPrediction','on','MinLeafSize',params.minLS,...
    'NumPredictorsToSample',params.numPTS);
oobErr = oobError(randomForest,'Mode','ensemble');
end
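Once bayesopt finishes, the tuned Mdl can be used like any other TreeBagger classifier. A quick sketch using the training table just to show the calls (in practice you would predict on held-out data):
% Predict class labels with the tuned ensemble. predict returns a cell
% array of class names, so convert it back to categorical for comparison.
predictedCylinders = categorical(predict(Mdl,X(:,Mdl.PredictorNames)));
confusionmat(Cylinders,predictedCylinders)   % rows: true class, columns: predicted class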
9 comments
Don Mathis 2018-6-26
>> load networkTraffic.mat
>> proto= categorical(cellstr(proto));
Undefined function or variable 'proto'.
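You can check which variables the MAT-file actually contains before loading it with whos and the '-file' option; a quick sketch with the file name from above:
% List the variables stored in the MAT-file without loading it.
info = whos('-file','networkTraffic.mat');
{info.name}   % names of the variables saved in the file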
Marta Caneda Portela
What if we need to do kFold validation to optimize hyperparameters?
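One possible approach, sketched below under the assumption that the same table X and response 'Cylinders' as above are used: replace the out-of-bag objective with a k-fold estimate built from cvpartition and average the per-fold misclassification rate (the function name kfoldErrRF and the choice of 5 folds are illustrative):
function cvErr = kfoldErrRF(params,X)
%kfoldErrRF Sketch of a k-fold cross-validated objective for bayesopt.
%   Trains a TreeBagger classifier on each training fold and returns the
%   mean misclassification rate over the validation folds.
k = 5;                                    % number of folds (illustrative)
c = cvpartition(X.Cylinders,'KFold',k);   % stratified partition on the response
foldErr = zeros(k,1);
for i = 1:k
    Xtrain = X(training(c,i),:);
    Xtest  = X(test(c,i),:);
    mdl = TreeBagger(300,Xtrain,'Cylinders','Method','classification',...
        'MinLeafSize',params.minLS,...
        'NumPredictorsToSample',params.numPTS);
    pred = predict(mdl,Xtest(:,mdl.PredictorNames));           % cell array of class names
    foldErr(i) = mean(~strcmp(pred,cellstr(Xtest.Cylinders))); % fold misclassification rate
end
cvErr = mean(foldErr);
end
bayesopt would then be called with @(params)kfoldErrRF(params,X) in place of the out-of-bag objective.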
