Gradient descent in simple terms
Implementation of gradient descent
Gradient descent (animation)
Below is a small interactive animation of gradient descent in 1D, 2D, and 3D. Tune the learning rate and the starting point, press Start, and watch the parameters move toward the minimum as the loss decreases.
Parameters

We minimize \(f(x) = (x - a)^2\). The panel reports the current iteration, the current \(x\), the loss \(f(x)\), and the gradient magnitude \(|\nabla f(x)|\); at the starting point \(x = 6\) the readout shows \(f(x) = 16\) and \(|\nabla f| = 8\).
Update rule:
\[x \leftarrow x - \alpha \, \nabla f(x), \qquad \text{where } \nabla f(x) = 2(x - a).\]
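The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not the animation's actual code; the value \(a = 2\) is an assumption consistent with the readout (\(x = 6\), \(f(x) = 16\), \(|\nabla f| = 8\)).

```python
def grad_descent(x0, a, lr, steps):
    """Run gradient descent on f(x) = (x - a)^2 starting from x0."""
    x = x0
    for _ in range(steps):
        grad = 2 * (x - a)   # ∇f(x) for f(x) = (x - a)^2
        x = x - lr * grad    # the update rule: x ← x − α·∇f(x)
    return x

# Same starting point as the animation; a = 2 is assumed.
x = grad_descent(x0=6.0, a=2.0, lr=0.1, steps=50)
print(x)  # close to the minimum at x = a
```

Each step shrinks the distance to the minimum by a constant factor \(1 - 2\alpha\), so for \(0 < \alpha < 1\) the iterates converge geometrically.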
Trajectory
The point moves toward the minimum (the vertical line marks \(a\)).
Loss chart
The chart shows the value of \(f\) over iterations. If \(\alpha\) is too large, the loss may oscillate, or even grow, instead of decreasing steadily.
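The effect of the learning rate can be checked numerically. A small sketch (again assuming \(a = 2\) and the starting point \(x = 6\)): for this quadratic, each step multiplies the distance to the minimum by \(1 - 2\alpha\), so \(\alpha > 1\) makes the loss grow every iteration.

```python
def losses(x0, a, lr, steps):
    """Return the loss f(x) = (x - a)^2 after each gradient step."""
    x, out = x0, []
    for _ in range(steps):
        x = x - lr * 2 * (x - a)  # x ← x − α·∇f(x)
        out.append((x - a) ** 2)
    return out

small = losses(6.0, 2.0, lr=0.1, steps=5)  # |1 − 2α| = 0.8: steady decrease
large = losses(6.0, 2.0, lr=1.1, steps=5)  # |1 − 2α| = 1.2: loss grows each step
```

With `lr=0.9` the factor is \(-0.8\): the point overshoots to the other side of the minimum each step (the oscillation visible in the chart) yet still converges.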