Linear Transformations

ReLU Activation

Linear transformations alone cannot solve complex, nonlinear problems: a composition of linear layers is still just a linear map. Activation functions such as ReLU or Sigmoid introduce the nonlinearity the network needs.
The ReLU function is defined as $\mathrm{ReLU}(x) = \max(0, x)$.
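The usage example below calls a LinearTransformation class with linearLayer() and relu() methods; the class itself is not shown in this section. A minimal sketch of what it could look like, assuming the constructor simply stores the weight matrix, linearLayer() computes $y = Wx + b$ row by row, and relu() applies $\max(0, x)$ element-wise, is:

<?php

// Assumed minimal implementation of the class used in the example below.
class LinearTransformation
{
    private array $weights;

    public function __construct(array $weights)
    {
        $this->weights = $weights;
    }

    // y_i = sum_j W[i][j] * x[j] + b[i]
    public function linearLayer(array $input, array $bias): array
    {
        $output = [];
        foreach ($this->weights as $i => $row) {
            $sum = $bias[$i];
            foreach ($row as $j => $weight) {
                $sum += $weight * $input[$j];
            }
            $output[$i] = $sum;
        }
        return $output;
    }

    // ReLU(x) = max(0, x), applied to each component of the vector.
    public function relu(array $vector): array
    {
        return array_map(fn ($x) => max(0, $x), $vector);
    }
}

With such a class in place, the snippet below runs as written.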

 
<?php

// Example usage with values that will produce both positive and negative results
$weightMatrix = [[-1, 2], [1, -2]];  // Weight matrix W
$inputVector = [5, 3];               // Input vector x
$bias = [-10, 2];                    // Bias vector b

$transform = new LinearTransformation($weightMatrix);

// Apply linear transformation with bias: y = Wx + b
$linearResult = $transform->linearLayer($inputVector, $bias);

// Apply ReLU activation
$activated = $transform->relu($linearResult);

echo "Original values: [<span id='output-vector'>" . implode(", ", $linearResult) . "</span>]\n";
echo "ReLU Output: [<span id='relu-vector'>" . implode(", ", $activated) . "</span>]";

Chart: visualization of the weight matrix ($W$), input vector ($x$), bias vector ($b$), and the resulting output vector ($y = Wx + b$), which here equals $[-9, 1]$.
Result:
Original values: [-9, 1]
ReLU Output: [0, 1]