Newton's method converges linearly from sufficiently close to a root of finite multiplicity greater than one
Statement
Suppose $f$ is a function of one variable that is at least once continuously differentiable at a root $c$ of $f$. Further, suppose $f'(c) = 0$, so that $c$ is a root of multiplicity greater than 1. Then, there exists $\delta > 0$ such that for any starting point $x_0 \in (c - \delta, c + \delta)$, the sequence obtained by applying Newton's method either reaches the root in finitely many steps or converges linearly to the root $c$. The rate of convergence is $(m - 1)/m$, where $m$ is the multiplicity of the root $c$ (i.e., the order of $c$ as a zero of $f$).
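Heuristically, if $f(x) = (x - c)^m g(x)$ with $g(c) \neq 0$, then near $c$ the Newton update satisfies $x_{n+1} - c \approx \frac{m-1}{m}(x_n - c)$, which is where the rate $(m - 1)/m$ comes from. As a purely illustrative numerical check (not part of the statement), the sketch below applies Newton's method to $f(x) = (x - 1)^3$, which has a root of multiplicity $m = 3$ at $c = 1$; the ratios of successive errors should approach $(m - 1)/m = 2/3$. The choice of function and starting point here is an illustrative assumption, not taken from the statement.

```python
# Illustration: Newton's method at a root of multiplicity m = 3.
# f(x) = (x - 1)^3 has the root c = 1 with multiplicity 3, so the
# error ratio |x_{n+1} - c| / |x_n - c| should approach (m - 1)/m = 2/3.

def f(x):
    return (x - 1.0) ** 3

def f_prime(x):
    return 3.0 * (x - 1.0) ** 2

def newton_errors(x0, steps):
    """Run Newton's method from x0 and record the distance to the root c = 1."""
    x = x0
    errors = []
    for _ in range(steps):
        x = x - f(x) / f_prime(x)    # Newton iteration
        errors.append(abs(x - 1.0))  # error |x_n - c|
    return errors

errors = newton_errors(x0=2.0, steps=20)
for e_prev, e_next in zip(errors, errors[1:]):
    print(e_next / e_prev)  # ratios tend to 2/3, i.e. linear convergence
```

For this particular $f$, the Newton update simplifies to $x_{n+1} - 1 = \tfrac{2}{3}(x_n - 1)$, so the printed ratios equal $2/3$ up to rounding; for a general root of multiplicity $m$, the ratios only approach $(m - 1)/m$ in the limit.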
Related facts
Proof
Based on the fact that $c$ is a root of $f$ of order $m$, there exist $\delta > 0$ and positive constants such that: