
crossval

Cross-validated decision tree

Description


cvmodel = crossval(model) creates a partitioned model from model, a fitted classification tree. By default, crossval uses 10-fold cross-validation on the training data to create cvmodel.

cvmodel = crossval(model,Name,Value) creates a partitioned model with additional options specified by one or more Name,Value pair arguments.

Examples


Create a classification model for the ionosphere data, then create a cross-validation model. Evaluate the quality of the model using kfoldLoss.

load ionosphere
tree = fitctree(X,Y);       % fit a classification tree to the predictors X and labels Y
cvmodel = crossval(tree);   % 10-fold cross-validation by default
L = kfoldLoss(cvmodel)      % cross-validated classification loss
L = 0.1083

Input Arguments


Classification model, specified as a ClassificationTree object. Use the fitctree function to create a classification tree object.

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: cvmodel = crossval(model,'Holdout',0.2)

Cross-validation partition, specified as the comma-separated pair consisting of 'CVPartition' and a cvpartition object created by the cvpartition function. crossval splits the data into subsets with cvpartition.

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.
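For illustration, here is a minimal sketch (assuming the tree and labels Y from the ionosphere example above, and a five-fold split, which is an arbitrary choice) that builds the partition with cvpartition and passes it to crossval:

cvp = cvpartition(Y,'KFold',5);              % stratified 5-fold partition based on the class labels
cvmodel = crossval(tree,'CVPartition',cvp);  % cross-validate the tree on that partition
L = kfoldLoss(cvmodel)                       % loss averaged over the five validation folds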

Fraction of the data used for holdout validation, specified as the comma-separated pair consisting of 'Holdout' and a scalar value in the range (0,1).

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.

Example: 'Holdout',0.3

Data Types: single | double
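As a sketch of holdout validation (again reusing tree from the ionosphere example, with 0.3 as an arbitrary fraction):

cvmodel = crossval(tree,'Holdout',0.3);  % train on 70% of the data, hold out 30% for validation
L = kfoldLoss(cvmodel)                   % misclassification rate on the held-out observations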

Number of folds to use in a cross-validated model, specified as the comma-separated pair consisting of 'KFold' and a positive integer value greater than 1.

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.

Example: 'KFold',3

Data Types: single | double
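A comparable sketch using three folds instead of the default ten (tree again comes from the ionosphere example):

cvmodel = crossval(tree,'KFold',3);  % partition the training data into 3 folds
L = kfoldLoss(cvmodel)               % loss averaged over the 3 validation folds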

Leave-one-out cross-validation flag, specified as the comma-separated pair consisting of 'Leaveout' and 'on' or 'off'. Leave-one-out is a special case of 'KFold' in which the number of folds equals the number of observations.

Use only one of these four options at a time: 'CVPartition', 'Holdout', 'KFold', or 'Leaveout'.

Example: 'Leaveout','on'
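A sketch of leave-one-out cross-validation (reusing tree from the ionosphere example; note that this fits one model per observation, so it can be slow for large data sets):

cvmodel = crossval(tree,'Leaveout','on');  % one validation fold per observation
L = kfoldLoss(cvmodel)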

Output Arguments


Partitioned model, returned as a ClassificationPartitionedModel object.

Tips

Assess the predictive performance of model on cross-validated data using the “kfold” methods and properties of cvmodel, such as kfoldLoss.
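For example, this sketch (reusing cvmodel and Y from the first example) obtains the cross-validated predictions with kfoldPredict and recomputes the misclassification rate by hand:

label = kfoldPredict(cvmodel);  % label(i) is predicted by the model trained without observation i's fold
err = mean(~strcmp(label,Y))    % fraction misclassified; compare with kfoldLoss(cvmodel)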

Alternatives

You can create a cross-validation tree directly from the data, instead of creating a decision tree followed by a cross-validation tree. To do so, include one of these five options in fitctree: 'CrossVal', 'KFold', 'Holdout', 'Leaveout', or 'CVPartition'.
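For instance, a minimal sketch that produces the cross-validated model in a single call:

load ionosphere
cvmodel = fitctree(X,Y,'CrossVal','on');  % returns a cross-validated (partitioned) model directly
L = kfoldLoss(cvmodel)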


Version History

Introduced in R2011a

See Also

fitctree | kfoldLoss | ClassificationPartitionedModel