The bisection method converges slowly, and the Newton-Raphson algorithm converges quickly when it converges.
Suppose a numerical method produces a sequence of iterates $p_n$ that converges to $p$. The method has order of convergence $\alpha$ if

$$\lim_{n\to\infty}\frac{|p_{n+1}-p|}{|p_n-p|^{\alpha}}=\lambda$$

for some constant $\lambda>0$, or equivalently if

$$|p_{n+1}-p|\approx\lambda\,|p_n-p|^{\alpha}\quad\text{for large }n.$$

When $\alpha=1$ the convergence is linear, and when $\alpha=2$ it is quadratic. If the error of an iterative algorithm is $10^{-2}$ at iteration $n$ then, for a linear method, it remains of order $10^{-2}$ at iteration $n+1$, but for a quadratic method the error becomes of order $10^{-4}$. Thus, higher-order methods generally converge more rapidly than lower-order ones.
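The difference can be made concrete with a short sketch. This iterates the two error recurrences directly, under the illustrative assumptions of an asymptotic constant $\lambda=\tfrac12$ and an initial error of $10^{-2}$:

```python
# Compare how a linear and a quadratic error recurrence shrink.
# Illustrative assumptions: asymptotic constant lam = 0.5, initial error 1e-2.
lam = 0.5
e_lin = 1e-2   # linear method:    e_{n+1} = lam * e_n
e_quad = 1e-2  # quadratic method: e_{n+1} = lam * e_n**2

for n in range(1, 4):
    e_lin = lam * e_lin
    e_quad = lam * e_quad ** 2
    print(f"step {n}: linear error = {e_lin:.2e}, quadratic error = {e_quad:.2e}")
```

After three steps the linear error has only shrunk by a factor of eight, while the quadratic error is already far below machine precision.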
Fixed Point Iteration
Consider a fixed point iteration $p_{n+1}=g(p_n)$ producing a sequence of iterates $p_n\to p$ as $n\to\infty$, such that $g(p)=p$ (since $g$ is continuous). Expand $g(p_n)$ as a Taylor series in powers of $(p_n-p)$: for some $\xi_n$ in the interval between $p_n$ and $p$,

$$g(p_n)=g(p)+g'(\xi_n)(p_n-p).$$

Thus $p_{n+1}-p=g'(\xi_n)(p_n-p)$, and

$$\lim_{n\to\infty}\frac{|p_{n+1}-p|}{|p_n-p|}=|g'(p)|,\qquad(1)$$

so for $g'(p)\neq 0$ fixed point iteration is linearly convergent.
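As a quick numerical check of (1), here is a minimal sketch for the illustrative choice $g(x)=\cos x$, whose fixed point is $p\approx 0.739085$; the ratio of successive errors settles at $|g'(p)|=\sin p\approx 0.674$, i.e. linear convergence:

```python
import math

# Fixed point iteration p_{n+1} = g(p_n) for the illustrative choice g(x) = cos(x).
g = math.cos
p_exact = 0.7390851332151607  # fixed point of cos(x), used only to measure errors

p = 1.0
errors = []
for _ in range(25):
    p = g(p)
    errors.append(abs(p - p_exact))

# Successive error ratios tend to |g'(p)| = sin(p) ~ 0.674: linear convergence.
ratios = [errors[n + 1] / errors[n] for n in range(len(errors) - 1)]
print(ratios[-1])
```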
Newton-Raphson
Equation (1) shows that the convergence of fixed point iteration is best when $|g'(p)|$ is small. The Newton-Raphson method, which is a fixed point iteration with the mapping function

$$g(x)=x-\frac{f(x)}{f'(x)},$$

achieves this:

$$g'(x)=1-\frac{f'(x)^2-f(x)f''(x)}{f'(x)^2}=\frac{f(x)f''(x)}{f'(x)^2}.$$

Thus, provided $f'(p)\neq 0$ (i.e. $p$ is a simple root, i.e. of multiplicity one), $g'(p)=0$, since, by definition, $f(p)=0$.
Expand $g(p_n)$ as a Taylor series in powers of $(p_n-p)$ up to second order: for some $\xi_n$ in the interval between $p_n$ and $p$,

$$g(p_n)=g(p)+g'(p)(p_n-p)+\frac{g''(\xi_n)}{2}(p_n-p)^2=p+\frac{g''(\xi_n)}{2}(p_n-p)^2,$$

since $g(p)=p$ and $g'(p)=0$. Thus

$$\lim_{n\to\infty}\frac{|p_{n+1}-p|}{|p_n-p|^2}=\frac{|g''(p)|}{2},$$
so the convergence of Newton's method is quadratic. However, in the case when $f'(p)=0$ (i.e. if $p$ is a multiple root), $g'(p)\neq 0$ (for a root of multiplicity $m$ one finds $g'(p)=1-1/m$). Thus

$$\lim_{n\to\infty}\frac{|p_{n+1}-p|}{|p_n-p|}=|g'(p)|\neq 0.$$

So, if $f'(p)=0$, the convergence of Newton's method is not quadratic but linear (i.e. the same as the general form of fixed point iteration).
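Both regimes can be observed numerically. Here is a minimal sketch using two illustrative functions: $f(x)=x^2-2$, whose root $\sqrt2$ is simple, and $f(x)=(x-1)^2$, whose root is double, so the error ratio settles at $g'(p)=1-\tfrac12=\tfrac12$:

```python
import math

def newton(f, fprime, x0, steps):
    """Newton-Raphson iteration x <- x - f(x)/f'(x); returns the iterates."""
    xs = []
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
        xs.append(x)
    return xs

# Simple root: f(x) = x^2 - 2, p = sqrt(2) -- quadratic convergence.
p = math.sqrt(2.0)
errs_simple = [abs(x - p) for x in newton(lambda x: x * x - 2.0,
                                          lambda x: 2.0 * x, 2.0, 6)]

# Double root: f(x) = (x - 1)^2, p = 1 -- convergence degrades to linear,
# with successive errors shrinking by the factor g'(p) = 1 - 1/2 = 1/2.
errs_double = [abs(x - 1.0) for x in newton(lambda x: (x - 1.0) ** 2,
                                            lambda x: 2.0 * (x - 1.0), 2.0, 6)]
print(errs_simple)
print(errs_double)
```

For the simple root the error reaches machine precision in a handful of steps; for the double root each step merely halves it.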
Bisection
At each step, the size of the interval that contains the root is halved, so

$$|p_n-p|\le\frac{b-a}{2^n},$$

where $[a,b]$ is the initial bracketing interval, but the error does not necessarily decrease monotonically. However, if we regard the upper bounds on the errors as estimates of the error, then bisection is linearly convergent, with the bound halving at each step ($\alpha=1$, $\lambda=\tfrac12$).
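A minimal bisection sketch (using $f(x)=x^2-2$ on $[0,2]$ as an illustrative example), halving the bracketing interval until it is shorter than a tolerance:

```python
def bisect(f, a, b, tol=1e-10):
    """Bisection: halve the interval [a, b] that brackets a sign change of f."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2.0
        if f(a) * f(m) <= 0:
            b = m  # the root lies in [a, m]
        else:
            a = m  # the root lies in [m, b]
    return (a + b) / 2.0

root = bisect(lambda x: x * x - 2.0, 0.0, 2.0)
print(root)
```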
Secant
This method requires two previous iterates (and hence two starting values) and is harder to analyse. However, it can be shown rigorously that

$$|p_{n+1}-p|\approx C\,|p_n-p|^{(1+\sqrt{5})/2},$$

giving an order of convergence of $\alpha=\frac{1+\sqrt{5}}{2}\approx 1.618$ (the golden ratio). The convergence is super-linear: better than linear but poorer than quadratic.
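The order can also be estimated empirically. In this minimal sketch (illustrative choice $f(x)=x^2-2$), the exponent estimate $\log(e_{n+1}/e_n)/\log(e_n/e_{n-1})$ drifts toward $(1+\sqrt5)/2\approx1.618$ as the iteration proceeds:

```python
import math

def secant(f, x0, x1, steps):
    """Secant method: Newton's method with f' replaced by a finite difference."""
    xs = [x0, x1]
    for _ in range(steps):
        f0, f1 = f(xs[-2]), f(xs[-1])
        xs.append(xs[-1] - f1 * (xs[-1] - xs[-2]) / (f1 - f0))
    return xs

p = math.sqrt(2.0)
xs = secant(lambda x: x * x - 2.0, 1.0, 2.0, 5)
errs = [abs(x - p) for x in xs]

# Empirical order of convergence from the last three errors.
alpha = math.log(errs[-1] / errs[-2]) / math.log(errs[-2] / errs[-3])
print(alpha)
```

Only a few steps are taken: once the error nears machine precision the ratio of errors is dominated by rounding and the estimate becomes meaningless.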