Anonymous
Backpropagation is a step in estimating the parameters of a model numerically with an optimization algorithm such as gradient descent or Adam. The model first makes a forward pass to compute its predictions and the loss on the data. Then, during backpropagation, the chain rule is applied backward through the model to compute the gradient of the loss with respect to each parameter. Finally, the optimizer updates the parameters using the computed gradient, a learning rate, and the optimizer's specific update rule. For stochastic gradient descent (SGD), the update delta is simply the learning rate multiplied by the gradient.
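
A minimal sketch of those steps, assuming a single linear model y_hat = w*x + b with a squared-error loss and made-up toy data (the variable names and numbers below are illustrative, not from the question): the forward pass computes the loss, the backward pass applies the chain rule to get the gradients, and the SGD step subtracts the learning rate times each gradient.

import numpy as np

# Toy data (assumed for illustration): points from y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

w, b = 0.0, 0.0        # parameters to estimate
learning_rate = 0.1

for step in range(200):
    # Forward pass: predictions and mean squared error loss.
    y_hat = w * x + b
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass (backpropagation): chain rule gives the gradient
    # of the loss with respect to each parameter.
    d_loss_d_yhat = 2.0 * (y_hat - y) / len(x)  # dL/dy_hat
    grad_w = np.sum(d_loss_d_yhat * x)          # dL/dw
    grad_b = np.sum(d_loss_d_yhat)              # dL/db

    # SGD update: delta is simply the learning rate times the gradient.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(w, b)  # should approach 2.0 and 1.0

An optimizer like Adam would replace the last two update lines with its own rule (keeping running averages of the gradient and its square), but the forward/backward structure stays the same.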