Gradient descent, explained simply
Example 4. Batch and stochastic gradient descent
Below is the output of running both variants of gradient descent on the same task $y = 2x$ (model $\hat{y} = wx$). Compare how the parameter $w$ changes from epoch to epoch.
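Both variants minimize the same mean-squared-error loss; writing out its derivative shows exactly which gradient each loop below computes:

$$L(w) = \frac{1}{n}\sum_{i=1}^{n}(w x_i - y_i)^2, \qquad \frac{\partial L}{\partial w} = \frac{2}{n}\sum_{i=1}^{n} x_i\,(w x_i - y_i).$$

Batch descent applies this full-sum gradient once per epoch; stochastic descent drops the averaging and updates $w$ after every sample, using the single-term gradient $2\,x_i\,(w x_i - y_i)$.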
Usage example
<?php
// Batch Gradient Descent: one update per epoch,
// using the gradient averaged over the whole dataset.
$x = [1, 2, 3, 4];
$y = [2, 4, 6, 8];
$w = 0.0;
$lr = 0.1;
$n = count($x);
define('EPOCHS', 10);

echo "Batch GD\n";
for ($epoch = 1; $epoch <= EPOCHS; $epoch++) {
    $gradient = 0.0;
    for ($i = 0; $i < $n; $i++) {
        $gradient += $x[$i] * (($w * $x[$i]) - $y[$i]);
    }
    $gradient = (2 / $n) * $gradient; // d(MSE)/dw over all samples
    $w -= $lr * $gradient;
    echo "Epoch $epoch: w = " . round($w, 4) . "\n";
}
echo "\n";

// Stochastic Gradient Descent: an update after every individual sample.
$x = [1, 2, 3, 4];
$y = [2, 4, 6, 8];
$w = 0.0;
$lr = 0.1;
$n = count($x);

echo "Stochastic GD\n";
for ($epoch = 1; $epoch <= EPOCHS; $epoch++) {
    for ($i = 0; $i < $n; $i++) {
        $gradient = 2 * $x[$i] * (($w * $x[$i]) - $y[$i]); // single-sample gradient
        $w -= $lr * $gradient;
    }
    echo "Epoch $epoch: w = " . round($w, 4) . "\n";
}
Result:
Batch GD
Epoch 1: w = 3
Epoch 2: w = 1.5
Epoch 3: w = 2.25
Epoch 4: w = 1.875
Epoch 5: w = 2.0625
Epoch 6: w = 1.9688
Epoch 7: w = 2.0156
Epoch 8: w = 1.9922
Epoch 9: w = 2.0039
Epoch 10: w = 1.998
Stochastic GD
Epoch 1: w = 1.4368
Epoch 2: w = 1.8414
Epoch 3: w = 1.9553
Epoch 4: w = 1.9874
Epoch 5: w = 1.9965
Epoch 6: w = 1.999
Epoch 7: w = 1.9997
Epoch 8: w = 1.9999
Epoch 9: w = 2
Epoch 10: w = 2
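In the run above, batch descent oscillates around $w = 2$ while the per-sample updates of SGD settle faster. A common compromise between the two is mini-batch descent: update $w$ once per small group of samples. Below is a minimal sketch for the same task; the batch size of 2 and the reduced learning rate of 0.05 are my own illustrative choices, not part of the original example.

```php
<?php
// Mini-batch Gradient Descent: a middle ground between batch and stochastic.
// Batch size 2 and learning rate 0.05 are illustrative choices.
$x = [1, 2, 3, 4];
$y = [2, 4, 6, 8];
$w = 0.0;
$lr = 0.05;
$batchSize = 2;
$n = count($x);

echo "Mini-batch GD\n";
for ($epoch = 1; $epoch <= 10; $epoch++) {
    for ($start = 0; $start < $n; $start += $batchSize) {
        $end = min($start + $batchSize, $n);
        $gradient = 0.0;
        for ($i = $start; $i < $end; $i++) {
            $gradient += $x[$i] * (($w * $x[$i]) - $y[$i]);
        }
        $gradient = (2 / ($end - $start)) * $gradient; // mean gradient over the batch
        $w -= $lr * $gradient;                         // one update per mini-batch
    }
    echo "Epoch $epoch: w = " . round($w, 4) . "\n";
}
```

With a mini-batch, the update noise is lower than in pure SGD, while each epoch still performs several updates instead of one; here $w$ approaches 2 within a handful of epochs.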