Sometimes we cannot solve an equation exactly. In that case we may still be able to solve it approximately using iteration.
Example: Solve the equation
\[x^2-3x+1=0\]
We can rearrange this equation so that x stands alone on one side.
\[x^2-3x+1=0 \rightarrow x^2=3x-1 \rightarrow x=\sqrt{3x-1}\]
From this we can get the iteration rule
\[x_{n+1}= \sqrt{3x_n-1}\]
If we take
\[x_0=1\]
then \[x_1=\sqrt{3 \times 1-1}=\sqrt{2}=1.414\]
to 3 decimal places. Continuing in this way, we have the table below.
| \(n\) | \(x_n\) | \(x_{n+1}=\sqrt{3x_n-1}\) |
|---|---|---|
| 0 | 1 | 1.414 |
| 1 | 1.414 | 1.801 |
| 2 | 1.801 | 2.098 |
| 3 | 2.098 | 2.301 |
| 4 | 2.301 | 2.430 |
| 5 | 2.430 | 2.508 |
| 6 | 2.508 | 2.554 |
| 7 | 2.554 | 2.581 |
| 8 | 2.581 | 2.597 |
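As a check on the table, here is a minimal Python sketch of the same iteration (the number of steps and the 3-decimal-place rounding are my own choices):

```python
from math import sqrt

x = 1.0  # x_0
for n in range(9):
    x_next = sqrt(3 * x - 1)              # apply the rule x_{n+1} = sqrt(3*x_n - 1)
    print(f"{n}  {x:.3f}  {x_next:.3f}")  # n, x_n, x_{n+1} to 3 decimal places
    x = x_next
```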
We could go on: successive iterates are getting closer together, which suggests the sequence is converging. In fact, because the original equation is a quadratic, we can solve it exactly.
\[x= \frac{3 \pm \sqrt{(-3)^2-4 \times 1 \times 1}}{2 \times 1} = \frac{3 \pm \sqrt{5}}{2} = 2.618 \text{ or } 0.382\]
to 3 decimal places.
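A quick check of this arithmetic in Python (a sketch; the rounding is mine):

```python
from math import sqrt

# Roots of x^2 - 3x + 1 = 0 from the quadratic formula, x = (3 ± sqrt(5)) / 2.
roots = [(3 + sqrt(5)) / 2, (3 - sqrt(5)) / 2]
print([round(r, 3) for r in roots])  # [2.618, 0.382]
```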
The iteration is converging to the first of these. An iteration can only converge to one solution at a time, and convergence is not guaranteed. The solution that the iterations converge to may be different if the original equation is rearranged to give a different iteration formula, as the sketch below illustrates.
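For example (this alternative rearrangement is my own illustration, not part of the worked example above), the same equation can be rearranged as
\[x^2-3x+1=0 \rightarrow x = \frac{x^2+1}{3}\]
giving the iteration rule
\[x_{n+1}= \frac{x_n^2+1}{3}\]
Starting from the same value, this converges to the other solution, 0.382. A minimal Python sketch:

```python
# Iterate the alternative rule x_{n+1} = (x_n^2 + 1) / 3 from x_0 = 1.
x = 1.0
for n in range(20):
    x = (x * x + 1) / 3
print(round(x, 3))  # 0.382, the other root
```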