Predict responses for observations in cross-validated regression model
When you create a cross-validated regression model, you can compute the mean squared error (MSE) by using the
kfoldLoss object function. Alternatively, you can predict responses for validation-fold observations using
kfoldPredict and compute the MSE manually.
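The "manual" computation is simply the usual mean squared error applied to the cross-validation predictions:

```latex
\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2
```

where \(\hat{y}_i\) is the validation-fold prediction for observation \(i\) and \(n\) is the number of observations.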
Load the carsmall data set. Specify the predictor data X and the response data Y.

load carsmall
X = [Cylinders Displacement Horsepower Weight];
Y = MPG;
Train a cross-validated regression tree model. By default, the software implements 10-fold cross-validation.
rng('default') % For reproducibility
CVMdl = fitrtree(X,Y,'CrossVal','on');
Compute the 10-fold cross-validation MSE by using kfoldLoss.

L = kfoldLoss(CVMdl)
L = 29.4963
Predict the responses yfit by using the cross-validated regression model. Compute the mean squared error between yfit and the true responses CVMdl.Y. The computed MSE matches the loss value returned by kfoldLoss.

yfit = kfoldPredict(CVMdl);
mse = mean((yfit - CVMdl.Y).^2)
mse = 29.4963
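For readers outside MATLAB, the same predict-then-average-by-hand equivalence can be sketched in plain NumPy. This is a hypothetical analogue, not the MathWorks API: it uses synthetic data and ordinary least squares in place of carsmall and a regression tree, but the fold bookkeeping mirrors what kfoldPredict and kfoldLoss do.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the carsmall predictors/response (hypothetical data)
n = 100
X = rng.normal(size=(n, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=n)

# 10-fold partition of the observation indices
k = 10
folds = np.array_split(rng.permutation(n), k)

# Assemble yfit the way kfoldPredict does: each observation is predicted
# by a model trained on the other nine folds
yfit = np.empty(n)
for val_idx in folds:
    train = np.ones(n, dtype=bool)
    train[val_idx] = False
    # Ordinary least squares with an intercept on the training folds
    Xtr = np.column_stack([np.ones(train.sum()), X[train]])
    beta, *_ = np.linalg.lstsq(Xtr, y[train], rcond=None)
    Xval = np.column_stack([np.ones(len(val_idx)), X[val_idx]])
    yfit[val_idx] = Xval @ beta

# Manual MSE, as in the MATLAB example
mse = np.mean((yfit - y) ** 2)

# kfoldLoss-style loss: per-fold MSEs averaged with fold-size weights
loss = sum(len(f) * np.mean((yfit[f] - y[f]) ** 2) for f in folds) / n
```

Because the folds partition the data, the fold-size-weighted average of per-fold MSEs equals the overall mse exactly, which is why the two numbers in the MATLAB example agree.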
CVMdl — Cross-validated partitioned regression model
Cross-validated partitioned regression model, specified as a RegressionPartitionedSVM object. You can create the model in two ways:

- Pass a trained regression model listed in the following table to its crossval object function.
- Train a regression model using a function listed in the following table, and specify one of the cross-validation name-value arguments for the function.
includeInteractions — Flag to include interaction terms

Flag to include interaction terms of the model, specified as true or false. This argument is valid only for a generalized additive model (GAM). That is, you can specify this argument only when CVMdl is a cross-validated GAM.

The default value is true if the models in CVMdl contain interaction terms. The value must be false if the models do not contain interaction terms.
yfit — Predicted responses
Predicted responses, returned as an n-by-1 numeric vector, where n is the number of observations. (n is size(CVMdl.X,1) when observations are in rows.) Each entry of yfit corresponds to the predicted response for the corresponding observation.

If you use a holdout validation technique to create CVMdl, then yfit contains NaN values for training-fold observations.
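A NumPy sketch of that NaN behavior (hypothetical data with simulated held-out predictions, not the MATLAB API): with a holdout partition, only validation observations receive predictions, so a loss computed by hand must skip the NaN training-fold entries.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
y = rng.normal(size=n)

# Simulated 25% holdout: only the held-out observations get predictions
holdout = rng.permutation(n)[: n // 4]
yfit = np.full(n, np.nan)  # training-fold entries stay NaN
yfit[holdout] = y[holdout] + rng.normal(scale=0.1, size=len(holdout))

# nanmean restricts the MSE to the validation-fold observations
mse = float(np.nanmean((yfit - y) ** 2))
```

A plain mean over yfit would return NaN here, which is why losses for holdout-validated models are computed over the validation fold only.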