Newton's method converges quadratically from sufficiently close to a root of multiplicity one
Statement
Suppose $f$ is a function of one variable that is at least two times continuously differentiable around a point $c$ in its domain, and further, $f(c) = 0$ and $f'(c) \ne 0$ (i.e., $c$ is a root of $f$ of multiplicity one). Then, there exists $\delta > 0$ such that for any $x_0 \in (c - \delta, c + \delta)$, the sequence obtained by applying Newton's method for root-finding for a function of one variable starting from $x_0$ either reaches the root $c$ in finitely many steps or has quadratic convergence (or higher-order convergence) to the root $c$.
Case that the second derivative is nonzero
In the case that $f''(c) \ne 0$, we get precisely quadratic convergence from sufficiently close to the root, with convergence rate:

$$\left|\frac{f''(c)}{2f'(c)}\right|$$
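As a numerical sanity check (the function $f(x) = x^2 - 2$ and all code below are illustrative choices, not part of the statement), the error ratio $|h_{n+1}|/|h_n|^2$ can be observed tending to $|f''(c)/(2f'(c))| = 1/(2\sqrt{2})$ at the root $c = \sqrt{2}$:

```python
# Illustrative sketch: Newton's method on f(x) = x^2 - 2, whose root
# c = sqrt(2) has multiplicity one. Predicted convergence rate:
# |f''(c) / (2 f'(c))| = 2 / (4 sqrt(2)) = 1 / (2 sqrt(2)) ~ 0.3536.

def newton(f, fprime, x0, steps):
    """Run Newton's method for root-finding, returning all iterates."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

c = 2 ** 0.5
xs = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=2.0, steps=5)
errors = [abs(x - c) for x in xs]

# The ratio |h_{n+1}| / |h_n|^2 should approach 1 / (2 sqrt(2)).
ratios = [errors[n + 1] / errors[n] ** 2 for n in range(3)]
print(ratios)
```

The ratios stop short of $n = 4$ only because the error there is already near machine precision.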
General case of an infinitely differentiable function with finite order of zero
Suppose that, of the integers greater than or equal to 2, $m$ is the smallest integer such that $f^{(m)}(c) \ne 0$. Then, we get convergence of order $m$, with convergence rate:

$$\left|\frac{(m-1)f^{(m)}(c)}{m!\,f'(c)}\right|$$
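For instance (an illustrative example, not part of the statement), $f(x) = x + x^3$ has a root at $c = 0$ with $f'(0) = 1$, $f''(0) = 0$, $f'''(0) = 6$, so $m = 3$ and the predicted convergence is of order 3 with rate $|(3-1) \cdot 6/(3! \cdot 1)| = 2$:

```python
# Illustrative sketch: f(x) = x + x^3 has m = 3 at the root c = 0.
# One Newton step maps x to x - (x + x^3)/(1 + 3x^2) = 2x^3 / (1 + 3x^2),
# so the ratio h_{n+1} / h_n^3 tends to the predicted rate 2.

def newton_step(x):
    return x - (x + x ** 3) / (1 + 3 * x ** 2)

x = 0.5
ratios = []
for _ in range(3):
    x_next = newton_step(x)
    ratios.append(x_next / x ** 3)
    x = x_next
print(ratios)
```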
Corresponding statement for optimization
The preceding statements are in the context of Newton's method for root-finding for a function of one variable. They also imply corresponding statements for Newton's method for optimization of a function of one variable, but all the derivative orders increase by one. Explicitly, for a function $f$, the statements for finding the minimum of $f$ correspond to statements for finding the zeros of $f'$ (i.e., set $g = f'$ to easily translate statements back and forth).
Suppose we have that $c$ is a point of local minimum of $f$. Suppose $f$ is at least three times continuously differentiable at $c$, and $f''(c) \ne 0$. Then, there exists a value $\delta > 0$ such that for all $x_0 \in (c - \delta, c + \delta)$, Newton's method for optimization, starting at $x_0$, converges to $c$, and the convergence is quadratic or of higher order.
Case that the third derivative is nonzero
In the case that $f'''(c) \ne 0$, the convergence is precisely quadratic and the convergence rate is:

$$\left|\frac{f'''(c)}{2f''(c)}\right|$$
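As an illustration (the example function is a hypothetical choice, not from the statement), $f(x) = x^2 + x^3$ has a local minimum at $c = 0$ with $f''(0) = 2$ and $f'''(0) = 6$, so the predicted rate is $|6/(2 \cdot 2)| = 3/2$:

```python
# Illustrative sketch: Newton's method for optimization iterates
# x -> x - f'(x)/f''(x). For f(x) = x^2 + x^3 one step is
# x - (2x + 3x^2)/(2 + 6x) = 3x^2 / (2 + 6x), so h_{n+1}/h_n^2 -> 3/2.

def newton_opt_step(x):
    return x - (2 * x + 3 * x ** 2) / (2 + 6 * x)

x = 0.1
ratios = []
for _ in range(3):
    x_next = newton_opt_step(x)
    ratios.append(x_next / x ** 2)
    x = x_next
print(ratios)
```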
General case of an infinitely differentiable function with finite order of zero
Suppose that, of the integers greater than or equal to 2, $m$ is the smallest integer such that $f^{(m+1)}(c) \ne 0$. Then, we get convergence of order $m$, with convergence rate:

$$\left|\frac{(m-1)f^{(m+1)}(c)}{m!\,f''(c)}\right|$$
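A quick check (again with a hypothetical example function): $f(x) = x^2 + x^4$ has its minimum at $c = 0$ with $f''(0) = 2$, $f'''(0) = 0$, $f^{(4)}(0) = 24$, so $m = 3$ and the predicted rate is $|2 \cdot 24/(3! \cdot 2)| = 4$ with cubic convergence:

```python
# Illustrative sketch: for f(x) = x^2 + x^4, one optimization step is
# x - (2x + 4x^3)/(2 + 12x^2) = 4x^3 / (1 + 6x^2), so the ratio
# h_{n+1} / h_n^3 tends to the predicted rate 4.

def newton_opt_step(x):
    return x - (2 * x + 4 * x ** 3) / (2 + 12 * x ** 2)

x = 0.1
ratios = []
for _ in range(3):
    x_next = newton_opt_step(x)
    ratios.append(x_next / x ** 3)
    x = x_next
print(ratios)
```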
Related facts
Proof
Proof for the case that the second derivative is nonzero
Since the function $f$ is twice continuously differentiable around $c$, we have a value $\delta > 0$ such that $f$ has a partial Taylor expansion around $c$ (going up to degree two) for $x \in (c - \delta, c + \delta)$. Explicitly, if $h = x - c$, we should be able to write:

$$f(c + h) = f(c) + f'(c)h + \frac{f''(c)}{2}h^2 + o(h^2)$$
where the $o(h^2)$ formally means that:

$$\lim_{h \to 0} \frac{o(h^2)}{h^2} = 0$$
Note that in case the function is three or more times differentiable, that term is actually $O(h^3)$.
Suppose $x_n$, the $n^{\text{th}}$ iterate of Newton's method, is in the interval $(c - \delta, c + \delta)$. Define:

$$h_n = x_n - c$$
We thus obtain:

$$f(x_n) = f(c) + f'(c)h_n + \frac{f''(c)}{2}h_n^2 + o(h_n^2)$$
Since $f(c) = 0$, this simplifies to:

$$f(x_n) = f'(c)h_n + \frac{f''(c)}{2}h_n^2 + o(h_n^2)$$
We also have:

$$f'(x_n) = f'(c) + f''(c)h_n + o(h_n)$$
We therefore obtain:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} = x_n - \frac{f'(c)h_n + \frac{f''(c)}{2}h_n^2 + o(h_n^2)}{f'(c) + f''(c)h_n + o(h_n)}$$
Taylor-expanding the reciprocal of the denominator and simplifying, we obtain:

$$\frac{f(x_n)}{f'(x_n)} = h_n\left(1 + \frac{f''(c)}{2f'(c)}h_n + o(h_n)\right)\left(1 - \frac{f''(c)}{f'(c)}h_n + o(h_n)\right) = h_n - \frac{f''(c)}{2f'(c)}h_n^2 + o(h_n^2)$$
We have:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} = x_n - h_n + \frac{f''(c)}{2f'(c)}h_n^2 + o(h_n^2)$$
Recall that $h_n = x_n - c$, so $x_n - h_n = c$. Similarly, $x_{n+1} = h_{n+1} + c$. Thus:

$$h_{n+1} + c = c + \frac{f''(c)}{2f'(c)}h_n^2 + o(h_n^2)$$
This simplifies to:

$$h_{n+1} = \frac{f''(c)}{2f'(c)}h_n^2 + o(h_n^2)$$
Notice that if $\delta$ is sufficiently small, then we are guaranteed that $|h_{n+1}| < |h_n|$, so that $x_{n+1}$ also lies in $(c - \delta, c + \delta)$ and the corresponding statement holds for $x_{n+1}$. In particular, we obtain convergence, and moreover, we obtain that:

$$\lim_{n \to \infty} \frac{|h_{n+1}|}{|h_n|^2} = \left|\frac{f''(c)}{2f'(c)}\right|$$
Proof for the case that a higher finite order derivative is nonzero
Consider the case that $f$ is infinitely differentiable, and of the integers that are at least 2, $m$ is the smallest one such that $f^{(m)}(c) \ne 0$. The proof remains very similar to the above, with the following changes.
We get:

$$f(x_n) = f'(c)h_n + \frac{f^{(m)}(c)}{m!}h_n^m + O(h_n^{m+1})$$
and:

$$f'(x_n) = f'(c) + \frac{f^{(m)}(c)}{(m-1)!}h_n^{m-1} + O(h_n^m)$$
Dividing and Taylor-expanding, we get:

$$\frac{f(x_n)}{f'(x_n)} = h_n - \frac{(m-1)f^{(m)}(c)}{m!\,f'(c)}h_n^m + O(h_n^{m+1})$$
Thus, we get:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} = x_n - h_n + \frac{(m-1)f^{(m)}(c)}{m!\,f'(c)}h_n^m + O(h_n^{m+1})$$
Since $x_n - h_n = c$, we get:

$$h_{n+1} = \frac{(m-1)f^{(m)}(c)}{m!\,f'(c)}h_n^m + O(h_n^{m+1})$$
Thus, if we start from sufficiently close to the root, the iterates converge and:

$$\lim_{n \to \infty} \frac{|h_{n+1}|}{|h_n|^m} = \left|\frac{(m-1)f^{(m)}(c)}{m!\,f'(c)}\right|$$