Backpropagation is an algorithm for computing the gradient of the cost function with respect to every weight and bias in the network. When training proceeds one example at a time (as in stochastic gradient descent), the gradient is computed for each training example in turn, so every example contributes to how the weights and biases are adjusted.
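As a minimal sketch of this per-example update (an illustrative toy model, not taken from the text), consider a single sigmoid neuron with a squared-error cost. Backpropagation applies the chain rule to get the gradient of the cost with respect to the weights and bias, and one gradient-descent step then nudges those parameters using that single example's gradient:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_single(x, y, w, b):
    """Gradients (dC/dw, dC/db) for cost C = 0.5 * (a - y)**2
    on one training example (x, y)."""
    # Forward pass: weighted input z, activation a.
    z = np.dot(w, x) + b
    a = sigmoid(z)
    # Backward pass: chain rule from cost back to parameters.
    delta = (a - y) * a * (1 - a)  # dC/dz = dC/da * da/dz
    grad_w = delta * x             # dC/dw = dC/dz * dz/dw
    grad_b = delta                 # dC/db = dC/dz * 1
    return grad_w, grad_b

# One update driven by a single (hypothetical) training example.
x = np.array([0.5, -1.0])
y = 1.0
w = np.array([0.1, 0.2])
b = 0.0
grad_w, grad_b = backprop_single(x, y, w, b)
w = w - 0.1 * grad_w  # learning rate 0.1, chosen for illustration
b = b - 0.1 * grad_b
```

In a full network the same chain-rule step is applied layer by layer, propagating `delta` backward from the output; the single-neuron case above shows the core calculation.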

Metrics