Gradient descent

From Calculus
Revision as of 03:53, 26 May 2014 by Vipul (talk | contribs)

Definition

Gradient descent is a general approach used in first-order iterative optimization algorithms whose goal is to find the (approximate) minimum of a function of multiple variables. The idea is that, at each stage of the iteration, we move in the direction of the negative of the gradient vector (or a computational approximation to it). The step size chosen at each stage depends on the specific algorithm used, and typically involves some form of line search.
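The iteration described above can be sketched as follows. This is a minimal illustration, not a canonical implementation: the function names, parameter values, and the choice of a backtracking (Armijo) line search for the step size are all assumptions for the sake of the example.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Minimize f starting from x0 by moving along -grad(x) at each
    stage, with a backtracking (Armijo) line search for the step size.
    All parameter defaults here are illustrative choices."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        # Stop once the gradient is (numerically) close to zero.
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search: shrink the step size until the
        # Armijo sufficient-decrease condition is satisfied.
        t = alpha0
        while f(x - t * g) > f(x) - c * t * np.dot(g, g):
            t *= beta
        # Move in the direction of the negative gradient.
        x = x - t * g
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 2(y + 3)^2, whose minimum is at (1, -3).
f = lambda v: (v[0] - 1) ** 2 + 2 * (v[1] + 3) ** 2
grad = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 3)])
x_min = gradient_descent(f, grad, [0.0, 0.0])
```

On this convex quadratic the iterates converge to the true minimizer; for general functions, gradient descent is only guaranteed to approach a critical point.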

Other names for gradient descent include steepest descent and the method of steepest descent.

The corresponding method for finding maxima moves in the direction of the (positive) gradient vector at each stage, and is called gradient ascent.
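Gradient ascent differs from gradient descent only in the sign of the update. A minimal sketch, assuming a fixed step size (the function name and parameter values are illustrative, not from the source):

```python
import numpy as np

def gradient_ascent(f, grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Maximize f starting from x0 by moving along +grad(x) at each
    stage, with a fixed step size (an illustrative simplification)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Move in the positive gradient direction (ascent).
        x = x + step * g
    return x

# Example: maximize f(x) = -(x - 2)^2, whose maximum is at x = 2.
x_max = gradient_ascent(lambda v: -(v[0] - 2) ** 2,
                        lambda v: np.array([-2 * (v[0] - 2)]),
                        [0.0])
```

Equivalently, gradient ascent on f is gradient descent on -f, so in practice one implementation suffices for both.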

Types of gradient descent