machine-learning ensemble-learning

Definition

Bagging

Bagging (Bootstrap Aggregating) is an ensemble learning technique designed to reduce the variance of a model. Formally, given a dataset D of size n, bagging generates m new datasets D_1, ..., D_m by sampling n instances from D uniformly and with replacement (bootstrapping).

A separate model f_i is trained on each D_i, and the final prediction is aggregated (e.g., majority voting for classification or averaging for regression):

f(x) = (1/m) * sum_{i=1}^{m} f_i(x)
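A minimal sketch of this procedure using scikit-learn's BaggingClassifier (the synthetic dataset and hyperparameter values are illustrative, not prescribed by the text):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic classification dataset.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 25 base estimators (decision trees by default) is trained on
# a bootstrap sample of the training set; predictions are combined by voting.
clf = BaggingClassifier(n_estimators=25, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Because each tree sees a different bootstrap sample, the averaged prediction is typically more stable than any single tree's.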

Random Forests

A Random Forest is a specific application of bagging using decision trees as base learners, with the additional constraint that only a random subset of features is considered at each split, further decorrelating the trees.
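The per-split feature subsampling described above can be sketched with scikit-learn's RandomForestClassifier (dataset and parameter choices are illustrative assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Illustrative synthetic dataset with 20 features.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# max_features="sqrt" means roughly sqrt(20) ~ 4 candidate features are
# considered at each split, which decorrelates the individual trees.
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)
forest.fit(X, y)
n_trees = len(forest.estimators_)
```

Restricting the candidate features at each split makes the trees less similar to one another, so averaging over them reduces variance more than plain bagging of unrestricted trees would.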