Backpropagation from scratch
Source: The spelled-out intro to neural networks and backpropagation: building micrograd

Backpropagation on paper

It implements backpropagation for two arithmetic operations (multiplication and addition), which are quite straightforward. The implementation is for this expression graph:

```python
a = Value(2.0, label='a')
b = Value(-3.0, label='b')
c = Value(10.0, label='c')
e = a*b; e.label = 'e'
d = e + c; d.label = 'd'
f = Value(-2.0, label='f')
L = d * f; L.label = 'L'
L
```

The most important thing to note here is the gradient accumulation step (shown at the bottom-left)....
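The `Value` class used above is not defined in this snippet. The following is a minimal sketch of such a class, along the lines of micrograd: each node stores its data, its gradient, and a `_backward` closure that applies the chain rule for its operation. The gradient accumulation the note highlights is the `+=` in each `_backward` (plain `=` would silently drop contributions when a node feeds into more than one downstream node).

```python
class Value:
    """Minimal scalar autograd node supporting + and * (illustrative sketch)."""

    def __init__(self, data, _children=(), label=''):
        self.data = data
        self.grad = 0.0                 # dL/d(this node), filled in by backward()
        self._backward = lambda: None   # chain-rule step for the op that made this node
        self._prev = set(_children)
        self.label = label

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1; accumulate, don't overwrite
            self.grad += 1.0 * out.grad
            other.grad += 1.0 * out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, and vice versa (product rule on scalars)
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological order guarantees a node's grad is complete
        # before its own _backward distributes it to its children.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0  # dL/dL = 1
        for node in reversed(topo):
            node._backward()
```

Running the expression graph from the note through this class, `L.backward()` yields `a.grad = 6.0` (= f * b = -2 * -3) and `b.grad = -4.0` (= f * a), matching what you get by applying the chain rule on paper.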