Batch Learning
Definition
Batch learning (or offline learning) is a paradigm where the model is trained on a static, fixed dataset in its entirety. The parameters are optimised over the complete sample, and the model is typically not updated once deployed.
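As a minimal sketch of this training regime (the function and variable names below are illustrative, not taken from any particular library), full-batch gradient descent for least-squares linear regression computes the gradient over the entire dataset before every parameter update:

    import numpy as np

    def batch_gradient_descent(X, y, lr=0.1, n_epochs=200):
        """Full-batch gradient descent for least-squares linear regression.

        Every update uses the gradient averaged over the entire dataset,
        which is the defining property of batch (offline) learning."""
        n_samples, n_features = X.shape
        w = np.zeros(n_features)
        for _ in range(n_epochs):
            residuals = X @ w - y                # predictions minus targets
            grad = X.T @ residuals / n_samples   # gradient of the mean squared error
            w -= lr * grad                       # one update per full pass over the data
        return w

    # Fit once on a fixed, fully available dataset (synthetic data for illustration)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=500)
    w_hat = batch_gradient_descent(X, y)

Once w_hat is estimated, the model is frozen; incorporating new observations would mean rerunning the whole procedure on the enlarged dataset.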
Comparison with Online Methods
Because it averages the gradient (or, equivalently, the empirical risk) over the entire training set, batch learning produces lower-variance updates and greater statistical stability than online learning. The trade-off is that memory and computational requirements grow with the size of the dataset, and adapting to new data requires a full retraining cycle.
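For contrast, an online counterpart sketched under the same illustrative assumptions updates the parameters from each incoming sample immediately; the steps are cheap and the model adapts continuously, but each gradient is estimated from a single, noisier observation:

    import numpy as np

    def online_sgd_update(w, x, y, lr=0.01):
        """One online (stochastic) update from a single new sample.

        Unlike the full-batch step, the gradient is estimated from one
        observation, so updates are incremental but higher-variance."""
        grad = (x @ w - y) * x   # gradient of the squared error on this sample
        return w - lr * grad

    # Streaming usage: the model adapts as samples arrive, with no need
    # to store the full dataset or run a complete retraining cycle.
    rng = np.random.default_rng(1)
    w = np.zeros(3)
    for _ in range(5000):
        x = rng.normal(size=3)
        y = x @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal()
        w = online_sgd_update(w, x, y)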