Definition
Learning Rate
The learning rate is a hyperparameter of an optimisation algorithm that determines the step size at each iteration while moving toward a minimum of a loss function. Informally, it controls how quickly a machine learning model “learns”, i.e. how much its weights change at each update.
Gradient Descent
In gradient descent, the learning rate ($\eta$) scales the magnitude of the gradient used to update the model parameters:

$$\theta_{t+1} = \theta_t - \eta \, \nabla_\theta L(\theta_t)$$

where $\theta_t$ denotes the parameters at iteration $t$ and $\nabla_\theta L(\theta_t)$ is the gradient of the loss function $L$ with respect to those parameters.
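As a minimal sketch, the update rule can be written directly in code. The function name, the toy loss $L(\theta) = \theta^2$, and the specific learning rate below are illustrative assumptions, not part of any particular library.

```python
def gradient_descent_step(theta, grad, learning_rate=0.1):
    """One update: move against the gradient, scaled by the learning rate."""
    return theta - learning_rate * grad

# Illustrative example: minimise L(theta) = theta**2, whose gradient is 2*theta.
theta = 5.0
for _ in range(50):
    theta = gradient_descent_step(theta, grad=2 * theta, learning_rate=0.1)
print(theta)  # ends close to the minimum at theta = 0
```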
A learning rate that is too high may cause the model to overshoot the minimum or diverge, while one that is too low leads to slow convergence and can leave the optimisation stuck in local minima or at saddle points.
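The overshoot and slow-convergence behaviours can be seen on the same toy quadratic; the rates 1.1, 0.001, and 0.1 below are arbitrary choices for illustration.

```python
def minimise_quadratic(learning_rate, steps=50, theta=5.0):
    """Run gradient descent on L(theta) = theta**2 and return the final parameter."""
    for _ in range(steps):
        theta = theta - learning_rate * 2 * theta  # gradient of theta**2 is 2*theta
    return theta

print(minimise_quadratic(1.1))    # too high: |theta| grows each step and diverges
print(minimise_quadratic(0.001))  # too low: barely moves toward 0 in 50 steps
print(minimise_quadratic(0.1))    # moderate: converges close to the minimum at 0
```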