Gradient descent with constant learning rate
Definition
Gradient descent with constant learning rate is a first-order iterative optimization method and is the simplest and most standard implementation of gradient descent. In this method, the positive real number by which the gradient vector is multiplied to determine the step is constant across iterations. This constant is termed the learning rate, and we will customarily denote it as $\alpha$.
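
Concretely, to minimize a differentiable function $f : \mathbb{R}^n \to \mathbb{R}$, the method starts from an initial guess $x_0$ and repeatedly applies the update rule

$$x_{t+1} = x_t - \alpha \nabla f(x_t)$$

where $\alpha > 0$ is the learning rate, held fixed for all iterations $t$.

The following is a minimal Python sketch of this update rule. The objective function, gradient, starting point, learning rate value, and iteration count are illustrative choices for the example, not part of the definition.

    import numpy as np

    def gradient_descent_constant_lr(grad, x0, alpha=0.1, num_iters=100):
        """Minimize a function by gradient descent with constant learning rate.

        grad:      function returning the gradient of the objective at a point
        x0:        initial guess
        alpha:     learning rate, the same at every iteration
        num_iters: number of update steps to perform
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(num_iters):
            # Constant-learning-rate update: x_{t+1} = x_t - alpha * grad f(x_t)
            x = x - alpha * grad(x)
        return x

    # Illustrative usage: minimize f(x, y) = x^2 + 2y^2, whose gradient is (2x, 4y).
    grad_f = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
    x_min = gradient_descent_constant_lr(grad_f, x0=[3.0, -2.0], alpha=0.1, num_iters=200)
    print(x_min)  # approaches the unique minimizer (0, 0)

Note that $\alpha$ is a tunable input to the procedure rather than something the method adapts: too large a value can cause the iterates to diverge, while too small a value slows convergence.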