Error, loss functions, and why they are needed

Case 4. Same accuracy – different log loss

Accuracy answers only one question: how often the model predicts the correct class at a chosen threshold (for example, 0.5). But it does not capture how confident the model is in its predictions.
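To make this concrete, accuracy at a threshold can be sketched as a small helper (a hypothetical function, not part of code.php):

```php
<?php

// Hypothetical helper: share of predictions whose thresholded
// class label matches the true label.
function accuracyAtThreshold(array $yTrue, array $probs, float $threshold = 0.5): float
{
    $correct = 0;
    foreach ($yTrue as $i => $label) {
        // Convert the probability into a hard class label.
        $predicted = $probs[$i] >= $threshold ? 1 : 0;
        if ($predicted === $label) {
            $correct++;
        }
    }
    return $correct / count($yTrue);
}

// Both a confident and a hesitant model land on the same labels:
echo accuracyAtThreshold([1, 0, 1, 0], [0.9, 0.2, 0.9, 0.2]) . PHP_EOL; // 1
echo accuracyAtThreshold([1, 0, 1, 0], [0.6, 0.4, 0.6, 0.4]) . PHP_EOL; // 1
```

Both models score an accuracy of 1, because the 0.5 threshold hides how far each probability sits from the decision boundary.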

In this case we will use the same true labels and two models that yield the same class labels at the 0.5 threshold. However, one model outputs probabilities closer to 0 and 1 and achieves a lower log loss, while the other stays closer to 0.5 and gets penalized with a higher log loss.

 
<?php

require_once __DIR__ . '/code.php';

$y = [1, 0, 1, 0];
$modelA = [0.9, 0.2, 0.9, 0.2];
$modelB = [0.6, 0.4, 0.6, 0.4];

echo "Log loss A: " . logLoss($y, $modelA) . PHP_EOL;
echo "Log loss B: " . logLoss($y, $modelB) . PHP_EOL;
Result:
Log loss A: 0.16425203348602
Log loss B: 0.51082562376599

Conclusion: two models can have the same accuracy but different log loss. Log loss evaluates the quality of the predicted probabilities (calibration) and penalizes confident mistakes much more strongly.
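The example calls logLoss() from code.php, which is not shown here. A minimal sketch of such a function, assuming the standard binary cross-entropy formula with probability clipping, could look like this (the real code.php may differ):

```php
<?php

// A minimal sketch of a binary log loss (cross-entropy) function.
// $eps clips probabilities away from 0 and 1 to avoid log(0).
function logLoss(array $yTrue, array $probs, float $eps = 1e-15): float
{
    $sum = 0.0;
    foreach ($yTrue as $i => $y) {
        // Clip the predicted probability into (eps, 1 - eps).
        $p = min(max($probs[$i], $eps), 1 - $eps);
        // Accumulate y*ln(p) + (1 - y)*ln(1 - p).
        $sum += $y * log($p) + (1 - $y) * log(1 - $p);
    }
    // Negate and average over all samples.
    return -$sum / count($yTrue);
}

echo logLoss([1, 0, 1, 0], [0.9, 0.2, 0.9, 0.2]) . PHP_EOL; // ≈ 0.1643
echo logLoss([1, 0, 1, 0], [0.6, 0.4, 0.6, 0.4]) . PHP_EOL; // ≈ 0.5108
```

With this definition, confident correct predictions (0.9 for class 1, 0.2 for class 0) contribute small penalties, while hesitant ones near 0.5 contribute much larger ones, reproducing the gap between models A and B above.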