Formally describe gradient descent.
Gradient descent is an iterative optimization algorithm that can be used to estimate a local minimum of a function. Starting from an initial guess, it repeatedly steps in the direction of the negative gradient of the cost function, updating the parameters as theta := theta - alpha * grad J(theta), where alpha is the learning rate that controls the step size.
Gradient descent is only guaranteed to find a local minimum; it will find a valley, but will not necessarily find the lowest valley. Fortunately, the residual sum of squares cost function used in linear regression is convex, so any local minimum it finds is also the global minimum.
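The update rule above can be sketched as follows for simple linear regression. This is a minimal illustration, not a production implementation; the data, learning rate, and iteration count are assumptions chosen so the fit converges.

```python
def gradient_descent(x, y, alpha=0.02, iterations=5000):
    """Fit y ~ w * x + b by stepping against the gradient of the
    residual sum of squares (RSS) cost function."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(iterations):
        # Gradient of RSS = sum((w*x_i + b - y_i)^2) w.r.t. w and b,
        # averaged over the n training examples.
        grad_w = sum(2 * (w * xi + b - yi) * xi for xi, yi in zip(x, y)) / n
        grad_b = sum(2 * (w * xi + b - yi) for xi, yi in zip(x, y)) / n
        # Step against the gradient, scaled by the learning rate.
        w -= alpha * grad_w
        b -= alpha * grad_b
    return w, b

# Illustrative noise-free data on the line y = 2x + 1,
# so the fit should recover w close to 2 and b close to 1.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = gradient_descent(x, y)
```

Because the RSS cost is convex in w and b, this procedure converges to the global minimum for any sufficiently small learning rate.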