The Mechanics of Backpropagation
Computational Graph & Gradient Flow
$x$ (Input): 1.5
$w$ (Weight): 0.5
$y$ (Target): 0.0
1. Forward Pass
Logit: $z = x \cdot w$
Activation: $a = \sigma(z)$, where $\sigma$ is the logistic sigmoid, $\sigma(z) = \frac{1}{1 + e^{-z}}$
Loss: $L = \frac{1}{2}(a - y)^2$
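The forward pass with the values above can be sketched in Python (a minimal illustration; the `sigmoid` helper is just the standard logistic function):

```python
import math

def sigmoid(z):
    # Logistic sigmoid: 1 / (1 + e^(-z))
    return 1.0 / (1.0 + math.exp(-z))

# Example values from the text
x, w, y = 1.5, 0.5, 0.0

z = x * w               # logit: 1.5 * 0.5 = 0.75
a = sigmoid(z)          # activation: sigma(0.75) ≈ 0.6792
L = 0.5 * (a - y) ** 2  # squared-error loss: ≈ 0.2306
```

Each intermediate value ($z$, $a$, $L$) is kept around because the backward pass reuses it when computing the local derivatives.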
2. Backward Pass
$$\frac{\partial L}{\partial a} = (a - y)$$
$$\frac{\partial a}{\partial z} = \sigma(z)(1 - \sigma(z))$$
$$\frac{\partial z}{\partial w} = x$$
$$\frac{\partial L}{\partial w} = \frac{\partial L}{\partial a} \cdot \frac{\partial a}{\partial z} \cdot \frac{\partial z}{\partial w}$$
Result: with $x = 1.5$, $w = 0.5$, $y = 0$, we get $z = 0.75$, $a \approx 0.6792$, and therefore $\frac{\partial L}{\partial w} \approx 0.6792 \times 0.2179 \times 1.5 \approx 0.222$.
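The full chain-rule computation can be written out directly, with each factor matching one of the partial derivatives above. As a sanity check, the analytic gradient is compared against a central finite difference (the step size `eps` is an arbitrary choice for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, w, y = 1.5, 0.5, 0.0

# Forward pass
z = x * w
a = sigmoid(z)

# Backward pass: multiply the local derivatives along the chain
dL_da = a - y                  # dL/da = (a - y)
da_dz = a * (1.0 - a)          # da/dz = sigma(z) * (1 - sigma(z))
dz_dw = x                      # dz/dw = x
dL_dw = dL_da * da_dz * dz_dw  # ≈ 0.222

# Numerical check via a central finite difference
def loss(w_):
    a_ = sigmoid(x * w_)
    return 0.5 * (a_ - y) ** 2

eps = 1e-6
numeric = (loss(w + eps) - loss(w - eps)) / (2 * eps)
assert abs(dL_dw - numeric) < 1e-6
```

Note that `da_dz` reuses the activation `a` computed in the forward pass rather than re-evaluating the sigmoid; caching intermediates this way is exactly what makes backpropagation efficient.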