# Gradient descent with decaying learning rate

From Calculus

Revision as of 15:10, 1 September 2014 by Vipul (talk | contribs)

## Definition

**Gradient descent with decaying learning rate** is a form of gradient descent where the learning rate varies as a function of the number of iterations, but does not otherwise depend on the value of the vector at that stage. The update rule is as follows:

$$\vec{x}^{(k+1)} = \vec{x}^{(k)} - \alpha_k \nabla f\left(\vec{x}^{(k)}\right)$$

where $\alpha_k$ depends only on $k$ and not on the choice of $\vec{x}^{(k)}$.
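A minimal sketch of this update rule in Python, assuming NumPy. The function, objective, and decay schedule below are hypothetical illustrations (the article does not fix a particular schedule): the gradient of $f(\vec{x}) = \|\vec{x}\|^2/2$ is $\vec{x}$ itself, and $\alpha_k = 1/(k+1)$ is one possible schedule depending only on $k$.

```python
import numpy as np

def gradient_descent_decaying(grad_f, x0, alpha, num_iters=100):
    """Gradient descent where the step size alpha_k depends only on
    the iteration number k, not on the current point x^(k)."""
    x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        # update rule: x^(k+1) = x^(k) - alpha_k * grad f(x^(k))
        x = x - alpha(k) * grad_f(x)
    return x

# Illustration on f(x) = ||x||^2 / 2, whose gradient is x,
# with the (assumed) schedule alpha_k = 1/(k+1).
x_min = gradient_descent_decaying(
    grad_f=lambda x: x,
    x0=[4.0, -2.0],
    alpha=lambda k: 1.0 / (k + 1),
    num_iters=200,
)
```

Note that `alpha` is called with `k` alone, which is exactly what distinguishes this method from variants whose step size depends on the current point.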

## Cases

| Type of decay | Example expression for $\alpha_k$ | More information |
| --- | --- | --- |
| linear decay | | Gradient descent with linearly decaying learning rate |
| quadratic decay | | Gradient descent with quadratically decaying learning rate |
| exponential decay | | Gradient descent with exponentially decaying learning rate |
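The decay types above can be contrasted with a short sketch. The specific expressions and constants below are illustrative assumptions, not the formulas from the linked pages: $\alpha_k \propto 1/k$ for a $1/k$-style decay, $\alpha_k \propto 1/k^2$ for a $1/k^2$-style decay, and $\alpha_k = \alpha_0 r^k$ with $0 < r < 1$ for exponential decay.

```python
# Hypothetical decay schedules for alpha_k; the exact expressions
# used on the linked pages are not reproduced here, so these forms
# and constants are illustrative assumptions.
alpha0 = 0.5

schedules = {
    "linear decay": lambda k: alpha0 / (k + 1),          # ~ 1/k
    "quadratic decay": lambda k: alpha0 / (k + 1) ** 2,  # ~ 1/k^2
    "exponential decay": lambda k: alpha0 * 0.9 ** k,    # alpha0 * r^k
}

# Each schedule is a positive, decreasing function of k alone.
for name, alpha in schedules.items():
    print(name, [round(alpha(k), 4) for k in range(5)])
```

Quadratic decay shrinks the step size much faster than linear decay, and exponential decay eventually shrinks it faster than either; which schedule converges is analyzed on the respective linked pages.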