oobEdge

Out-of-bag classification edge for bagged classification ensemble model

Description

e = oobEdge(ens) returns the classification edge e for the out-of-bag data in the bagged classification ensemble model ens.

e = oobEdge(ens,Name=Value) specifies additional options using one or more name-value arguments. For example, you can specify the indices of the weak learners to use for calculating the edge, and the aggregation level for the output.

Examples

Load Fisher's iris data set.

load fisheriris

Train an ensemble of 100 bagged classification trees using the entire data set.

Mdl = fitcensemble(meas,species,'Method','Bag');

Estimate the out-of-bag edge.

edge = oobEdge(Mdl)
edge = 0.8767

Input Arguments

ens — Bagged classification ensemble model, specified as a ClassificationBaggedEnsemble model object trained with fitcensemble.

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: oobEdge(ens,Learners=[1 2 3 5]) specifies to use the first, second, third, and fifth learners in the ensemble ens.

Learners — Indices of the weak learners in the ensemble to use with oobEdge, specified as a vector of positive integers in the range [1,ens.NumTrained]. By default, the function uses all learners.

Example: Learners=[1 2 4]

Data Types: single | double
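As a hedged sketch of the Learners argument, the following restricts the edge calculation to a subset of the trained trees. The training call mirrors the example above; the exact edge values vary from run to run because bagging resamples randomly.

```matlab
% Sketch: compare the out-of-bag edge from a subset of learners with
% the edge from the full ensemble.
load fisheriris
Mdl = fitcensemble(meas,species,'Method','Bag');

% Out-of-bag edge using only the first ten trees
e10 = oobEdge(Mdl,Learners=1:10);

% Out-of-bag edge using every trained learner (the default)
eAll = oobEdge(Mdl);
```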

Mode — Aggregation level for the output, specified as "ensemble", "individual", or "cumulative".

Value          Description
"ensemble"     The output is a scalar value, the edge for the entire ensemble.
"individual"   The output is a vector with one element per trained learner.
"cumulative"   The output is a vector in which element J is obtained by using learners 1:J from the input list of learners.

Example: Mode="individual"

Data Types: char | string
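The difference between the aggregation levels can be sketched as follows (the model and data follow the example above; exact values vary because bagging is random):

```matlab
% Sketch: the Mode argument controls whether oobEdge returns one value
% or a vector of values.
load fisheriris
Mdl = fitcensemble(meas,species,'Method','Bag');

% "cumulative": element J is the out-of-bag edge of the sub-ensemble
% formed by learners 1:J, so the vector shows how the edge evolves as
% trees are added
eCum = oobEdge(Mdl,Mode="cumulative");

% "individual": one out-of-bag edge value per trained learner
eInd = oobEdge(Mdl,Mode="individual");
```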

UseParallel — Flag to run in parallel, specified as a numeric or logical 1 (true) or 0 (false). If you specify UseParallel=true, the oobEdge function executes for-loop iterations by using parfor. The loop runs in parallel when you have Parallel Computing Toolbox™.

Example: UseParallel=true

Data Types: logical

More About

Edge

The edge is the weighted mean value of the classification margin. The weights are the class probabilities in ens.Prior.
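The definition above can be sketched numerically. The margin and weight values below are hypothetical, chosen only to illustrate the weighted mean:

```matlab
% Sketch: the edge is the weighted mean of the per-observation
% classification margins. Values here are hypothetical.
m = [0.9; 0.4; -0.2; 0.7];     % per-observation classification margins
w = [0.25; 0.25; 0.25; 0.25];  % observation weights (uniform, summing to 1)
e = sum(w .* m);               % weighted mean margin = edge
% e is 0.45 for these hypothetical values
```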

Margin

The classification margin is the difference between the classification score for the true class and the maximal classification score for the false classes. margin is a column vector with the same number of rows as the matrix ens.X.
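A minimal sketch of this definition, using hypothetical scores for a single observation in a three-class problem:

```matlab
% Sketch: margin = score of the true class minus the largest score
% among the other (false) classes. Scores here are hypothetical.
scores = [0.7 0.2 0.1];   % classification scores for classes 1..3
trueClass = 1;            % index of the true class for this observation
others = scores;
others(trueClass) = -Inf; % exclude the true class from the maximum
margin = scores(trueClass) - max(others);  % 0.7 - 0.2 = 0.5
```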

Out of Bag

Bagging, which stands for “bootstrap aggregation,” is a type of ensemble learning. To bag a weak learner such as a decision tree on a data set, fitcensemble generates many bootstrap replicas of the data set and grows decision trees on these replicas. fitcensemble obtains each bootstrap replica by randomly selecting N observations out of N with replacement, where N is the data set size. To find the predicted response of a trained ensemble, predict takes an average over predictions from the individual trees.

Drawing N out of N observations with replacement omits, on average, 37% (1/e) of the observations for each decision tree. These omitted observations are called "out-of-bag" observations. For each observation, oobLoss estimates the out-of-bag prediction by averaging over predictions from all trees in the ensemble for which this observation is out of bag. It then compares the computed prediction against the true response for this observation. It calculates the out-of-bag error by comparing the out-of-bag predicted responses against the true responses for all observations used for training. This out-of-bag average is an unbiased estimator of the true ensemble error.
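The 37% figure above follows from a short calculation: the probability that a given observation is never drawn in N draws with replacement is (1 - 1/N)^N, which approaches 1/e as N grows.

```matlab
% Sketch of the "37% out of bag" claim. N = 150 matches the size of
% the Fisher iris data set used in the example above.
N = 150;
pOOB = (1 - 1/N)^N;   % fraction of observations left out of one replica
% pOOB is approximately 0.367, close to exp(-1)
```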

Extended Capabilities

Version History

Introduced in R2011a
