use the applet to compute five iterations with both the Jacobi and Gauss-Seidel Methods, using initial value x^{(0)} = (-0.5, 0.5), and record your results. For each method, complete a table like the one below, which shows the first set of entries for Jacobi's Method. To better see the results, click the Default Axes button and then Zoom out a few times.
k | x^{(k)} | ||error^{(k)}||
---|---|---
1 | (0.000000, 3.000000) | 3.162278
2 | |
3 | |
4 | |
5 | |
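If you want to check the applet's entries by hand, the iterations are easy to reproduce. Below is a minimal sketch of Jacobi's Method in Python; the 2×2 system `A`, right-hand side `b`, and exact solution used here are hypothetical stand-ins, since the exercise's actual system is defined inside the applet.

```python
import numpy as np

def jacobi(A, b, x0, iters):
    """Jacobi's Method: every new component is computed from the old iterate."""
    D = np.diag(np.diag(A))
    R = A - D                               # off-diagonal part, L + U
    x = x0.astype(float)
    history = [x.copy()]
    for _ in range(iters):
        x = np.linalg.solve(D, b - R @ x)   # x_new = D^{-1}(b - (L + U) x)
        history.append(x.copy())
    return history

# Hypothetical system (not the applet's): the exact solution is x = (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
b = np.array([3.0, 3.0])
exact = np.array([1.0, 1.0])

# Print k, x^{(k)}, and the error norm ||x^{(k)} - x|| for five iterations.
for k, xk in enumerate(jacobi(A, b, np.array([-0.5, 0.5]), 5)):
    print(k, xk, np.linalg.norm(xk - exact))
```

The same loop gives Gauss-Seidel if each updated component is used immediately instead of waiting for the full sweep.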
and use the applet to compute five iterations, again making a table for each method. This time the approximations produced by the iterations should converge. Find the eigenvalues of the new iteration matrix B for each method, and use this information to explain why the approximations converge in this case. To better see the results, click the
is strictly diagonally dominant, then the eigenvalues of the iteration matrices B corresponding to the Jacobi and Gauss-Seidel Methods are of magnitude < 1 (which guarantees convergence, as you found in Exercise 7).
and verify that the eigenvalues for this matrix are
Compute the errors
Consider solving each using Jacobi's Method. Since the coefficient matrices A differ, so do the iteration matrices B, and thus so do the convergence properties of Jacobi's Method.
We will consider the convergence rate of Jacobi’s Method for solving
Recall that
k | ||e^{(k)}|| | ||e^{(k)}|| / ||e^{(k-1)}||
---|---|---
0 | 1.414214 | −
1 | 0.942809 | 0.666666
2 | |
3 | |
4 | |
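The ratio ||e^{(k)}|| / ||e^{(k-1)}|| measures how much the error shrinks at each step, and it is governed by the eigenvalues of B. The sketch below uses a hypothetical system (the exercise's matrix is defined in the applet), chosen so that e^{(0)} lies along an eigenvector of B with eigenvalue of magnitude 2/3; in that situation the ratio equals 2/3 at every step.

```python
import numpy as np

# Hypothetical system (not necessarily the exercise's).
# Its Jacobi iteration matrix is B = [[0, -2/3], [-2/3, 0]],
# with eigenvalues ±2/3 and eigenvectors (1, 1) and (1, -1).
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])
exact = np.array([1.0, 1.0])
b = A @ exact                        # so the exact solution is (1, 1)

D = np.diag(np.diag(A))
R = A - D
x = np.array([0.0, 0.0])             # e^{(0)} = -(1, 1), an eigenvector of B
prev = np.linalg.norm(x - exact)     # ||e^{(0)}|| = sqrt(2) = 1.414214
for k in range(1, 5):
    x = np.linalg.solve(D, b - R @ x)
    err = np.linalg.norm(x - exact)
    print(k, err, err / prev)        # ratio is 2/3 = 0.666666... every step
    prev = err
```

For a general starting vector the ratio need not be constant, but it settles toward the magnitude of B's dominant eigenvalue as the iteration proceeds.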
In general, what would
What conditions on a and b would guarantee convergence (that is, what would guarantee that
Do Exercise 14 with the Gauss-Seidel Method instead of Jacobi’s Method.
ω | ||error^{(10)}||
---|---
1.1 | 0.000664
1.2 |
1.3 |
1.4 |
1.5 |
1.6 |
1.7 |
1.8 |
1.9 |
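A sweep like this table's is straightforward to script. Below is a minimal SOR sketch in Python; the system `A` and exact solution are hypothetical stand-ins (the exercise's system, and its tabulated value 0.000664, come from the applet), so the printed error norms illustrate the shape of the experiment rather than reproduce the table.

```python
import numpy as np

def sor(A, b, x0, omega, iters):
    """SOR: each Gauss-Seidel component update is over-relaxed by omega."""
    x = x0.astype(float)
    n = len(b)
    for _ in range(iters):
        for i in range(n):
            sigma = A[i] @ x - A[i, i] * x[i]       # off-diagonal contribution
            gs = (b[i] - sigma) / A[i, i]           # plain Gauss-Seidel value
            x[i] = (1 - omega) * x[i] + omega * gs  # blend old value with it
    return x

# Hypothetical symmetric positive definite system; exact solution is (1, 1).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
exact = np.array([1.0, 1.0])
b = A @ exact

# Record ||error^{(10)}|| for each omega, as in the table above.
for omega in [1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9]:
    x10 = sor(A, b, np.zeros(2), omega, 10)
    print(f"{omega:.1f}  {np.linalg.norm(x10 - exact):.6f}")
```

Setting ω = 1 recovers the Gauss-Seidel Method, which is why the sweep starts just above 1.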
One could use the quadratic formula to find the two eigenvalues (as functions of ω) that satisfy
We can simplify this last equation to
Journal of Online Mathematics and its Applications