On this page and the next, we attempt to answer two questions regarding the Jacobi and Gauss-Seidel methods:

- When will each of these methods work? That is, under what conditions will they produce a sequence of approximations **x**^{(0)}, **x**^{(1)}, **x**^{(2)}, … that converges to the true solution **x**?
- When they do work, how quickly will the approximations approach the true solution? That is, what will the rate of convergence be?

Both methods are based on a splitting *A* = *M* − *N*, so that *A***x** = **b** takes the form

*M***x** = *N***x** + **b**

so that

**x** = *M*^{−1}*N***x** + *M*^{−1}**b**

which we can rewrite as

**x** = *B***x** + **c**

where

*B* = *M*^{−1}*N* and **c** = *M*^{−1}**b**

*B* is often referred to as the iteration matrix.
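To make the splitting concrete, here is a numerical sketch using the Jacobi choice *M* = diag(*A*). The matrix and right-hand side below are arbitrary example values chosen for this sketch, not taken from the text:

```python
import numpy as np

# Arbitrary small example system for this sketch (not from the text).
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

# Jacobi splitting: M is the diagonal part of A, and N = M - A,
# so that A = M - N.
M = np.diag(np.diag(A))
N = M - A

# Iteration matrix B = M^{-1} N and vector c = M^{-1} b
# (computed with solve rather than an explicit inverse).
B = np.linalg.solve(M, N)
c = np.linalg.solve(M, b)

# Iterate x^{(k+1)} = B x^{(k)} + c from the zero vector.
x = np.zeros(3)
for _ in range(50):
    x = B @ x + c

# After enough iterations, x satisfies A x = b to high accuracy.
assert np.allclose(A @ x, b)
```

Solving with *M* rather than forming *M*^{−1} explicitly is the usual numerical practice; for the Jacobi splitting, *M* is diagonal, so this solve is trivial.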

If the current approximation **x**^{(k)} is, in fact, the exact solution **x**, then the iterative method should certainly produce a next iteration

**x**^{(k+1)} = *B***x**^{(k)} + **c** = *B***x** + **c**

that is again equal to **x**. Of course, since the problem we are trying to solve is *A***x** = **b**, that is, (*M* − *N*)**x** = **b**, we have

*B***x** + **c** = *M*^{−1}*N***x** + *M*^{−1}**b** = *M*^{−1}(*N***x** + **b**) = *M*^{−1}*M***x** = **x**

so the exact solution **x** is indeed a fixed point of the iteration.

On the other hand, choosing a splitting *A* = *M* − *N* and an initial guess **x**^{(0)} produces a sequence of approximations **x**^{(0)}, **x**^{(1)}, **x**^{(2)}, … that may or may not converge to **x**. Whether a particular method will work depends on the iteration matrix *B* = *M*^{−1}*N*.

To understand the convergence properties of an iterative method

**x**^{(k+1)} = *B***x**^{(k)} + **c**

we subtract it from the equation

**x** = *B***x** + **c**

which gives us

**x** − **x**^{(k+1)} = *B*(**x** − **x**^{(k)})

That is, since the current error is **e**^{(k)} = **x** − **x**^{(k)}, we have

**e**^{(k+1)} = *B***e**^{(k)}, and hence **e**^{(k)} = *B*^{k}**e**^{(0)}
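The relationship **e**^{(k)} = *B*^{k}**e**^{(0)} can be checked numerically. The following sketch uses the Jacobi splitting on an arbitrary example system (values chosen only for illustration):

```python
import numpy as np

# Arbitrary small example system for this sketch (not from the text).
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
x_true = np.linalg.solve(A, b)

# Jacobi splitting: M = diag(A), N = M - A, B = M^{-1} N, c = M^{-1} b.
M = np.diag(np.diag(A))
B = np.linalg.solve(M, M - A)
c = np.linalg.solve(M, b)

# Run k iterations from the zero vector and record the errors.
k = 10
x = np.zeros(3)
e0 = x_true - x                       # e^(0) = x - x^(0)
for _ in range(k):
    x = B @ x + c
ek = x_true - x                       # e^(k) = x - x^(k)

# The error after k steps equals B^k applied to the initial error.
assert np.allclose(ek, np.linalg.matrix_power(B, k) @ e0)
```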

To be clear, the superscript of *B* is the *power* of *B*, while the superscript of vector **e** (inside parentheses) is the *iteration number*.

Full appreciation of the significance of the relationship **e**^{(k)} = *B*^{k}**e**^{(0)} requires the notion of a matrix norm. For any matrix *B* and vector **v**, we have ||*B***v**|| ≤ ||*B*|| ||**v**||; that is, the norm of the matrix tells us how much bigger or smaller (in norm) *B***v** can be, compared to **v**.

Consequently, since ||**e**^{(k)}|| = ||*B*^{k}**e**^{(0)}|| ≤ ||*B*||^{k} ||**e**^{(0)}||, if ||*B*|| < 1 then ||**e**^{(k)}|| → 0 (that is, **e**^{(k)} → **0**) as *k* → ∞.
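The geometric bound ||**e**^{(k)}|| ≤ ||*B*||^{k} ||**e**^{(0)}|| can also be observed numerically. The sketch below again uses the Jacobi splitting on an arbitrary example matrix (not from the text), with the infinity norm:

```python
import numpy as np

# Arbitrary small example system for this sketch (not from the text).
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
x_true = np.linalg.solve(A, b)

# Jacobi splitting and its iteration matrix.
M = np.diag(np.diag(A))
B = np.linalg.solve(M, M - A)
c = np.linalg.solve(M, b)

# Infinity norm of B; for this example it is 0.6 < 1, so the errors
# must shrink by at least that factor each step.
norm_B = np.linalg.norm(B, np.inf)
assert norm_B < 1

# Check ||e^(k)|| <= ||B||^k ||e^(0)|| at every step.
x = np.zeros(3)
e0 = np.linalg.norm(x_true - x, np.inf)
for k in range(1, 21):
    x = B @ x + c
    ek = np.linalg.norm(x_true - x, np.inf)
    assert ek <= norm_B**k * e0 + 1e-12
```

The small `1e-12` slack only guards against floating-point rounding; in exact arithmetic the bound holds with no slack at all.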

We end this section by noting one condition sometimes encountered in practice that will guarantee that ||*B*|| < 1: that *A* is strictly diagonally dominant. This means that, for each row of *A*, the absolute value of the *diagonal* element is larger than the sum of the absolute values of the *off*-diagonal elements.
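Strict diagonal dominance is easy to test in code. The sketch below checks the condition and then confirms that, for the Jacobi splitting *M* = diag(*A*) measured in the infinity norm, it does force ||*B*|| < 1; the example matrix is an arbitrary choice for illustration:

```python
import numpy as np

def is_strictly_diagonally_dominant(A):
    """True if each diagonal entry exceeds the sum of the other magnitudes in its row."""
    d = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - d
    return bool(np.all(d > off))

# Arbitrary example matrix (not from the text): strictly diagonally dominant.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
assert is_strictly_diagonally_dominant(A)

# For the Jacobi splitting M = diag(A), each row of B = M^{-1}(M - A)
# sums (in absolute value) to less than 1, so ||B||_inf < 1.
M = np.diag(np.diag(A))
B = np.linalg.solve(M, M - A)
assert np.linalg.norm(B, np.inf) < 1
```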

Journal of Online Mathematics and its Applications