What Is AdaBoost?
AdaBoost (adaptive boosting) is an ensemble learning algorithm that can be used for classification or regression. Although AdaBoost is more resistant to overfitting than many machine learning algorithms, it is often sensitive to noisy data and outliers.
AdaBoost is called adaptive because it uses multiple iterations to generate a single composite strong learner. AdaBoost creates the strong learner (a classifier that is well correlated with the true classifier) by iteratively adding weak learners (classifiers that are only slightly correlated with the true classifier). During each round of training, a new weak learner is added to the ensemble and a weighting vector is adjusted to focus on examples that were misclassified in previous rounds. The result is a classifier with higher accuracy than any of the weak learners alone.
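To make the reweighting loop concrete, here is a minimal, hypothetical sketch of AdaBoost.M1 with decision stumps as weak learners, on toy data with labels in {-1, +1}. The variable names and the brute-force stump search are illustrative assumptions, not a reference implementation:

rng(1);                                    % reproducible toy data
X = [randn(50,2)+1; randn(50,2)-1];        % two overlapping Gaussian clusters
y = [ones(50,1); -ones(50,1)];             % binary labels in {-1,+1}
n = size(X,1);  T = 20;                    % T boosting rounds
w = ones(n,1)/n;                           % start with uniform example weights
alpha = zeros(T,1); feat = zeros(T,1); thr = zeros(T,1); pol = zeros(T,1);
for t = 1:T
    bestErr = inf;                         % weak learner: best weighted stump
    for j = 1:size(X,2)
        for s = unique(X(:,j))'
            for p = [-1 1]
                pred = p*sign(X(:,j) - s);  pred(pred == 0) = p;
                err = sum(w(pred ~= y));   % weighted training error
                if err < bestErr
                    bestErr = err; feat(t) = j; thr(t) = s; pol(t) = p;
                end
            end
        end
    end
    alpha(t) = 0.5*log((1 - bestErr)/max(bestErr, eps));  % learner weight
    pred = pol(t)*sign(X(:,feat(t)) - thr(t));  pred(pred == 0) = pol(t);
    w = w .* exp(-alpha(t)*y.*pred);       % up-weight misclassified examples
    w = w / sum(w);                        % renormalize to a distribution
end
F = zeros(n,1);                            % strong learner: weighted vote
for t = 1:T
    pred = pol(t)*sign(X(:,feat(t)) - thr(t));  pred(pred == 0) = pol(t);
    F = F + alpha(t)*pred;
end
fprintf('Training accuracy: %.2f\n', mean(sign(F) == y));

The key step is the weight update w = w .* exp(-alpha*y.*pred): examples the new stump misclassifies gain weight, so the next round's stump concentrates on them, while the final prediction is a vote of all stumps weighted by their alphas.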
Adaptive boosting includes the following algorithms (a toolbox usage sketch follows the list):
- AdaBoost.M1 and AdaBoost.M2 – the original algorithms for binary and multiclass classification, respectively
- LogitBoost – binary classification (for poorly separable classes)
- Gentle AdaBoost or GentleBoost – binary classification (for use with multilevel categorical predictors)
- RobustBoost – binary classification (robust against label noise)
- LSBoost – least squares boosting (for regression ensembles)
- LPBoost – multiclass classification using linear programming boosting
- RUSBoost – multiclass classification for skewed or imbalanced data
- TotalBoost – multiclass classification (more robust than LPBoost)
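As a rough sketch of how these variants are typically selected in the toolbox, assuming the fitcensemble function, the templateTree weak-learner template, and the ionosphere sample data set (check the current documentation for the exact option names):

load ionosphere                            % sample binary data set: X, Y
Mdl = fitcensemble(X, Y, ...
    'Method', 'AdaBoostM1', ...            % swap in 'LogitBoost', 'GentleBoost',
    'NumLearningCycles', 100, ...          % 'RUSBoost', etc. to change variants
    'Learners', templateTree('MaxNumSplits', 10));
trainErr = resubLoss(Mdl);                 % resubstitution classification error

For regression ensembles, fitrensemble with 'Method','LSBoost' would play the analogous role.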
For more information on adaptive boosting, see Statistics and Machine Learning Toolbox™.
See also: machine learning, support vector machine