Definition
Gradient Checking
Gradient checking is a debugging technique used to verify the correctness of an analytical gradient implementation (typically computed via backpropagation). It compares the analytical gradient with a numerical approximation derived from the limit definition of the derivative.
Numerical Approximation
The approximation is calculated using the two-sided (central) finite difference formula for a small ε (e.g., ε = 10⁻⁷):

    ∂J/∂θᵢ ≈ [ J(θ₁, …, θᵢ + ε, …, θₙ) − J(θ₁, …, θᵢ − ε, …, θₙ) ] / (2ε)
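The central-difference idea above can be sketched as a short routine. This is a minimal illustration, not a production implementation: the function name `numerical_gradient` and the example cost function are assumptions for demonstration, and the loop perturbs one parameter at a time, which is why the method is too slow for anything but spot checks.

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-7):
    """Approximate the gradient of scalar cost J at theta via central differences."""
    grad = np.zeros_like(theta, dtype=float)
    for i in range(theta.size):
        e = np.zeros_like(theta, dtype=float)
        e[i] = eps  # perturb only the i-th parameter
        grad[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad

# Example: J(theta) = sum(theta^2), whose true gradient is 2*theta.
J = lambda t: float(np.sum(t ** 2))
approx = numerical_gradient(J, np.array([1.0, 2.0]))
```

For J(θ) = Σθᵢ², the approximation lands very close to the analytical gradient 2θ, with error on the order of ε².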
Verification Criterion: The implementation is considered correct if the relative difference between the analytical gradient g and the numerical gradient g_approx is sufficiently small:

    ‖g_approx − g‖₂ / ( ‖g_approx‖₂ + ‖g‖₂ )  <  10⁻⁷ (roughly; thresholds up to about 10⁻⁵ are commonly treated as acceptable)
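The verification criterion can be sketched as follows. The helper name `gradient_check` and the default tolerance are assumptions; the norm-ratio form is used rather than a raw difference so the result is scale-invariant, and the denominator guards against division issues when both gradients are tiny.

```python
import numpy as np

def gradient_check(g_analytical, g_numerical, tol=1e-7):
    """Return the relative difference between two gradient vectors
    and whether it falls under the tolerance."""
    num = np.linalg.norm(g_analytical - g_numerical)
    den = np.linalg.norm(g_analytical) + np.linalg.norm(g_numerical)
    diff = num / den if den > 0 else 0.0  # both zero => gradients agree exactly
    return diff, diff < tol

# Example: a close numerical estimate passes; a buggy one fails.
g = np.array([2.0, 4.0])
diff_ok, passed_ok = gradient_check(g, np.array([2.0 + 1e-9, 4.0 - 1e-9]))
diff_bad, passed_bad = gradient_check(g, np.array([2.0, 3.0]))
```

In the example, the near-identical pair yields a relative difference far below the tolerance, while the deliberately wrong second component pushes the ratio well above it.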
This process confirms that the complex chain of partial derivatives in a deep neural architecture has been implemented correctly before committing to large-scale training. Because the numerical approximation requires two cost evaluations per parameter, it is far too slow to use during training itself and is run only as a one-off diagnostic.