machine-learning ensemble-learning
Definition
Boosting
Boosting is an iterative ensemble learning technique that aims to reduce the bias of a model by combining multiple “weak” learners into a single “strong” learner. Formally, boosting trains base learners sequentially, where each subsequent learner focuses on correcting the errors (residuals) of its predecessors.
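To make this concrete, here is a minimal usage sketch with scikit-learn's AdaBoostClassifier (AdaBoost being one classic boosting algorithm); the synthetic dataset and hyperparameters are illustrative choices, not prescribed by the definition above.

```python
# Illustrative usage of a boosting ensemble built from shallow "weak" trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each boosting round adds one weak learner that concentrates on the
# examples the current ensemble still gets wrong.
clf = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```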
Gradient Boosting
In gradient boosting, each new model $h_m$ is trained to minimise the residual risk of the current ensemble $F_{m-1}$. Given a learning rate $\eta$, the ensemble is updated as:

$$F_m(x) = F_{m-1}(x) + \eta \, h_m(x)$$

where $h_m$ is the new base learner and $r_i = y_i - F_{m-1}(x_i)$ are the residual errors it is fitted to (more generally, the negative gradients of the loss). This procedure can be viewed as performing gradient descent in function space.
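A minimal from-scratch sketch of this update is given below; it assumes squared-error loss (for which the negative gradient is exactly the residual $y_i - F_{m-1}(x_i)$) and shallow regression trees as weak learners, with illustrative hyperparameter values.

```python
# A from-scratch sketch of the update F_m(x) = F_{m-1}(x) + eta * h_m(x).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_rounds=100, eta=0.1, max_depth=2):
    """Gradient boosting for regression under squared-error loss."""
    # F_0: the best constant prediction (the mean of y for squared error).
    f0 = float(np.mean(y))
    current = np.full(len(y), f0)
    learners = []
    for _ in range(n_rounds):
        # Pseudo-residuals: for squared-error loss these are simply y - F_{m-1}(x).
        residuals = y - current
        # Fit the next weak learner h_m to the residuals of the ensemble so far.
        h = DecisionTreeRegressor(max_depth=max_depth)
        h.fit(X, residuals)
        # Ensemble update: F_m(x) = F_{m-1}(x) + eta * h_m(x).
        current = current + eta * h.predict(X)
        learners.append(h)
    return f0, learners

def predict_gradient_boosting(f0, learners, X, eta=0.1):
    pred = np.full(X.shape[0], f0)
    for h in learners:
        pred += eta * h.predict(X)
    return pred
```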
Line Search: In practical implementations, the fixed learning rate $\eta$ is often replaced by an adaptive step size identified via line search. This involves solving a one-dimensional optimisation problem at each step to find the $\eta_m$ that minimises the loss of the updated ensemble:

$$\eta_m = \arg\min_{\eta} \sum_{i=1}^{n} L\big(y_i,\, F_{m-1}(x_i) + \eta \, h_m(x_i)\big).$$
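One possible way to carry out this one-dimensional optimisation, sketched below, uses a bounded scalar minimiser (scipy.optimize.minimize_scalar); the bracket on $\eta$ and the squared-error loss are assumptions made for the example.

```python
# Line search for the step size eta_m along the direction h_m.
import numpy as np
from scipy.optimize import minimize_scalar

def line_search_step(y, ensemble_pred, h_pred, loss):
    """Return eta_m = argmin_eta loss(y, F_{m-1}(x) + eta * h_m(x))."""
    objective = lambda eta: loss(y, ensemble_pred + eta * h_pred)
    # Bounded 1-D minimisation; the (0, 10) bracket is an illustrative choice.
    result = minimize_scalar(objective, bounds=(0.0, 10.0), method="bounded")
    return result.x

# Example with squared-error loss (y, ensemble_pred, h_pred are numpy arrays):
squared_error = lambda y_true, y_pred: np.mean((y_true - y_pred) ** 2)
# eta_m = line_search_step(y, ensemble_pred, h_pred, squared_error)
```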