Definition
L2 Regularisation
L2 regularisation (or Ridge regression) is a regularisation technique that adds a penalty term proportional to the squared L2 norm of the parameter vector to the loss function. Formally, the regularised loss is:

L_reg(w) = L(w) + λ ||w||_2^2

where w are the model weights and λ ≥ 0 is the regularisation parameter.
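The regularised loss can be sketched in a few lines of numpy; the function name `l2_regularised_loss` and the choice of mean squared error as the base loss L(w) are illustrative assumptions, not from the text:

```python
import numpy as np

def l2_regularised_loss(X, y, w, lam):
    """MSE base loss plus the L2 penalty lam * ||w||^2.

    X, y, w, lam: illustrative names for data, targets,
    weights, and the regularisation parameter.
    """
    residuals = X @ w - y
    mse = np.mean(residuals ** 2)          # base loss L(w)
    penalty = lam * np.sum(w ** 2)         # λ ||w||_2^2
    return mse + penalty

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -2.0, 0.5])
w = np.array([0.9, -1.8, 0.6])

plain = l2_regularised_loss(X, y, w, lam=0.0)
regularised = l2_regularised_loss(X, y, w, lam=0.1)
```

For λ = 0 the penalty vanishes and the regularised loss reduces to the base loss; for λ > 0 the difference between the two is exactly λ ||w||².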
Weight Shrinkage
This technique (often termed weight decay in deep learning) penalises large weights more heavily than L1 regularisation, because the penalty grows quadratically with weight magnitude. Unlike L1, it shrinks weights towards zero without driving them exactly to zero, so weight is spread across features rather than concentrated on a few. This leads to more stable predictions and improved generalisation, since the model cannot become overly sensitive to any single input dimension.
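The shrinkage effect can be seen directly in linear regression, where ridge has the closed-form solution w = (XᵀX + λI)⁻¹ Xᵀy; the data and the λ value below are illustrative assumptions:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution (X^T X + lam*I)^-1 X^T y.

    lam = 0 recovers ordinary least squares.
    """
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + 0.1 * rng.normal(size=30)

w_ols = ridge_fit(X, y, lam=0.0)     # unregularised fit
w_ridge = ridge_fit(X, y, lam=10.0)  # shrunk towards zero
```

For any λ > 0 the ridge solution has a strictly smaller norm than the least-squares solution, which is the weight shrinkage described above.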