Stochastic gradient descent
Stochastic gradient descent (SGD) is a simple optimization algorithm that is widely used in machine learning, often in combination with backpropagation. SGD iteratively makes small adjustments to a model's parameters to decrease the model's error.
Exey Panteleev | CC BY 2.0
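The idea of repeated small parameter adjustments can be sketched in a few lines. The toy problem below (fitting a single weight w so that w*x matches y = 3x) and the specific learning rate are illustrative choices, not part of the original text: each iteration picks one random sample, computes the gradient of the squared error, and nudges w a small step against it.

```python
import random

# Toy SGD sketch: fit y = w*x to data generated by y = 3x,
# adjusting w one randomly chosen sample at a time.
data = [(x, 3.0 * x) for x in range(1, 11)]

w = 0.0      # initial guess for the weight
lr = 0.001   # learning rate: size of each small adjustment
random.seed(0)

for _ in range(200):
    x, y = random.choice(data)   # one sample per step (the "stochastic" part)
    grad = 2 * (w * x - y) * x   # gradient of the squared error (w*x - y)**2
    w -= lr * grad               # small step against the gradient

print(round(w, 3))  # w ends up close to 3.0
```

Using a single random sample per step (rather than the whole dataset, as in plain gradient descent) makes each update cheap, which is why SGD scales to large training sets.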