Rates of Convergence

The bisection method converges slowly, while the Newton-Raphson algorithm converges quickly when it converges at all.

Suppose a numerical method produces a sequence of iterates $(p_n)$ that converges to $p$. The method has order of convergence $\alpha$ if

$$\lim_{n\to\infty} \frac{|p_{n+1} - p|}{|p_n - p|^{\alpha}} = \lambda$$

for some constant $\lambda > 0$, or equivalently if

$$|p_{n+1} - p| \approx \lambda\,|p_n - p|^{\alpha} \quad \text{for large } n.$$

When $\alpha = 1$ the convergence is linear, and when $\alpha = 2$ it is quadratic. If the error of an iterative algorithm is $10^{-2}$ at iteration $n$ then, for a linear method, it remains of order $10^{-2}$ at iteration $n+1$, but for a quadratic method the error becomes of order $10^{-4}$. Thus, higher-order methods generally converge more rapidly than lower-order ones.
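To make the comparison concrete, here is a minimal Python sketch (illustrative only, not part of the original notes; the constants $\lambda = 0.5$ for the linear model and $\lambda = 1$ for the quadratic model are assumptions chosen for the example) that applies the error relation above to the same starting error of $10^{-2}$:

    err_linear = err_quadratic = 1e-2
    for n in range(4):
        print(f"n={n}: linear error ~ {err_linear:.1e}, quadratic error ~ {err_quadratic:.1e}")
        err_linear *= 0.5                    # |p_{n+1} - p| ~ 0.5 |p_n - p|   (alpha = 1)
        err_quadratic = err_quadratic ** 2   # |p_{n+1} - p| ~ |p_n - p|^2     (alpha = 2)

After three steps the linear model has only reached about $10^{-3}$, while the quadratic model is already at $10^{-16}$.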

Fixed Point Iteration

Consider a fixed point iteration $p_{n+1} = g(p_n)$ producing a sequence of iterates $p_1, p_2, p_3, \ldots$ As $n \to \infty$, $p_n \to p$ such that $p = g(p)$ (since $g$ is continuous).

Expand $g(p_n)$ as a Taylor series in powers of $(p_n - p)$. For some $\xi_n$ in the interval between $p_n$ and $p$,

$$g(p_n) = g(p) + (p_n - p)\,g'(\xi_n).$$

Thus, $p_{n+1} - p = g(p_n) - g(p) = (p_n - p)\,g'(\xi_n)$ and

$$|p_{n+1} - p| \approx |g'(p)|\,|p_n - p| \qquad (1)$$

so for $g'(p) \neq 0$ fixed point iteration is linearly convergent.
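For instance (a minimal sketch, not part of the original notes), iterating $g(x) = \cos x$ from an arbitrary starting guess illustrates equation (1): the error is multiplied by roughly $|g'(p)| = |\sin p| \approx 0.67$ at every step.

    import math

    p_exact = 0.7390851332151607       # fixed point of cos(x), quoted for reference
    p = 0.5                            # arbitrary starting guess
    for n in range(8):
        p_next = math.cos(p)           # p_{n+1} = g(p_n)
        ratio = abs(p_next - p_exact) / abs(p - p_exact)
        print(f"n={n}: p = {p_next:.10f}, error ratio ~ {ratio:.3f}")
        p = p_next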

Newton-Raphson

Equation (1) shows that the convergence of fixed point iterations is best when $|g'(p)|$ is small. The Newton-Raphson method, which is a fixed point iteration method with the mapping function

$$g(x) = x - \frac{f(x)}{f'(x)},$$

achieves this. Differentiating,

$$g'(x) = 1 - \frac{[f'(x)]^2 - f(x)\,f''(x)}{[f'(x)]^2} = \frac{f(x)\,f''(x)}{[f'(x)]^2}.$$

Thus, provided $f'(p) \neq 0$ (i.e. $p$ is a simple root, i.e. of multiplicity one), $g'(p) = 0$ since, by definition, $f(p) = 0$.

Expand $g(p_n)$ as a Taylor series in powers of $(p_n - p)$ up to second order. For some $\xi_n$ in the interval between $p_n$ and $p$,

$$g(p_n) = g(p) + (p_n - p)\,g'(p) + \frac{(p_n - p)^2}{2}\,g''(\xi_n) = p + \frac{(p_n - p)^2}{2}\,g''(\xi_n)$$

since $g'(p) = 0$ and $g(p) = p$. Thus

$$|p_{n+1} - p| \approx \frac{|g''(p)|}{2}\,|p_n - p|^2,$$

so the convergence of Newton's method is quadratic. However, in the case when $f'(p) = 0$ (i.e. if $p$ is a repeated root), $g'(p) \neq 0$; one can show that $g'(p) = 1 - 1/m$ for a root of multiplicity $m$. Thus

$$|p_{n+1} - p| \approx |g'(p)|\,|p_n - p|.$$

So, if $f'(p) = 0$ the convergence of Newton's method is not quadratic but linear (i.e. the same as the general form of fixed point iteration).
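As an illustration (a minimal sketch, not part of the original notes), applying Newton's method to $f(x) = x^2 - 2$, whose root $\sqrt{2}$ is simple, shows the error roughly squaring at each step until it reaches machine precision:

    import math

    def newton(f, fprime, x, root, steps):
        # root is the known exact root, used here only to report the error
        for n in range(steps):
            x = x - f(x) / fprime(x)   # p_{n+1} = p_n - f(p_n) / f'(p_n)
            print(f"n={n+1}: error = {abs(x - root):.2e}")
        return x

    newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0, root=math.sqrt(2), steps=5)

Repeating the experiment with $f(x) = (x - \sqrt{2})^2$ and $f'(x) = 2(x - \sqrt{2})$, which has a double root, the error only halves at each step, matching the linear behaviour predicted above (here $g'(p) = 1 - 1/m = 1/2$ for multiplicity $m = 2$).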

Bisection

At each step, the size of the interval that contains the root is halved, so if $\varepsilon_n$ denotes the length of the bracketing interval after $n$ steps then $\varepsilon_{n+1} = \varepsilon_n / 2$, but the error itself does not necessarily decrease monotonically. However, if we regard the upper bound of the error as an estimate of the error, then bisection is linearly convergent (with $\lambda = 1/2$).
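For example (a minimal sketch, not part of the original notes), bisecting $f(x) = x^2 - 2$ on $[1, 2]$ halves the bracketing interval at every step, even though the actual error occasionally increases:

    import math

    def bisect(f, a, b, root, steps):
        # root is the known exact root, used here only to report the actual error
        for n in range(steps):
            m = (a + b) / 2.0
            if f(a) * f(m) <= 0:       # sign change in [a, m]: the root lies there
                b = m
            else:                      # otherwise the root lies in [m, b]
                a = m
            print(f"n={n+1}: interval length = {b - a:.2e}, error = {abs(m - root):.2e}")

    bisect(lambda x: x * x - 2, 1.0, 2.0, root=math.sqrt(2), steps=8)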

Secant

This method uses the two previous iterates at each step and is harder to analyse. However, it can be shown in a strict mathematical sense that

$$|p_{n+1} - p| \approx \lambda\,|p_n - p|^{\alpha} \quad \text{with} \quad \alpha = \frac{1 + \sqrt{5}}{2} \approx 1.618,$$

giving an order of convergence equal to the golden ratio. The convergence is super-linear: better than linear but poorer than quadratic.
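As a rough check (a minimal sketch, not part of the original notes), running the secant method on $f(x) = x^2 - 2$ and estimating the order from successive errors via $\log|e_{n+1}| / \log|e_n|$ gives values that drift towards $1.618$ as the error shrinks:

    import math

    f = lambda x: x * x - 2
    root = math.sqrt(2)
    x_prev, x = 1.0, 2.0               # the method needs two starting points
    errors = []
    for n in range(5):
        # secant update: p_{n+1} = p_n - f(p_n) (p_n - p_{n-1}) / (f(p_n) - f(p_{n-1}))
        x_prev, x = x, x - f(x) * (x - x_prev) / (f(x) - f(x_prev))
        errors.append(abs(x - root))

    for e_prev, e in zip(errors, errors[1:]):
        print(f"error = {e:.2e}, estimated order ~ {math.log(e) / math.log(e_prev):.2f}")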
