Gradient descent with constant learning rate


Definition

Gradient descent with constant learning rate is a first-order iterative optimization method and is the simplest and most standard implementation of gradient descent. In this method, the real number by which the gradient vector is multiplied to determine the step size is the same at every iteration. This constant is termed the learning rate, and we will customarily denote it as $\alpha$. Explicitly, to minimize a differentiable function $f$, we start from an initial guess $x_0$ and iterate $x_{k+1} = x_k - \alpha \nabla f(x_k)$.
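
The iteration above can be made concrete with a minimal sketch. The function names, the stopping tolerance, and the example objective below are illustrative choices, not part of the original page; the only thing the sketch is meant to exhibit is that the multiplier alpha on the gradient never changes across iterations.

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, max_iters=1000, tol=1e-8):
    """Minimize a differentiable function via gradient descent with a
    constant learning rate.

    grad: function returning the gradient vector at a point
    x0:   starting point (array-like)
    alpha: the constant learning rate
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        # Stop once the gradient is numerically zero, i.e. we have
        # (approximately) reached a critical point.
        if np.linalg.norm(g) < tol:
            break
        # Constant learning rate: the step is always alpha times the
        # negative gradient, independent of the iteration number.
        x = x - alpha * g
    return x

# Example: minimize f(x, y) = x^2 + 2y^2, whose gradient is (2x, 4y).
minimum = gradient_descent(lambda x: np.array([2 * x[0], 4 * x[1]]),
                           x0=[3.0, -2.0], alpha=0.1)
print(minimum)  # approximately [0, 0]
```

Note that alpha here is a hyperparameter fixed before the run: too large a value can cause the iterates to overshoot and diverge, while too small a value makes convergence slow.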