Basic Iterative Methods

For large or highly sparse linear systems, direct solvers may become impractical due to memory or computational constraints. In such cases, iterative methods provide an alternative by generating a sequence of approximate solutions that (ideally) converge to the true solution. This section introduces two of the most fundamental iterative methods: Jacobi and Gauss-Seidel.

Jacobi Method

The Jacobi method is a simple, perfectly parallelizable approach for solving $Ax = b$. At each iteration, every component of $x$ is updated independently using only the values from the previous iteration. Let $A = D + L + U$, where $D$ is the diagonal part of $A$, $L$ is the strictly lower triangular part, and $U$ is the strictly upper triangular part. The update rule is
$$x^{(k+1)} = D^{-1}\left(b - (L + U)\,x^{(k)}\right),$$
or, component-wise,
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j \neq i} a_{ij}\, x_j^{(k)}\right), \qquad i = 1, \dots, n.$$
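
As a concrete illustration, here is a minimal NumPy sketch of the component-wise Jacobi update. The function name `jacobi` and the tolerance and iteration-cap parameters are illustrative choices, not part of the text.

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Sketch of the Jacobi iteration: x^(k+1) = D^{-1} (b - (L + U) x^(k))."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.copy()
    D = np.diag(A)                      # diagonal entries a_ii
    R = A - np.diagflat(D)              # off-diagonal part L + U
    for k in range(max_iter):
        x_new = (b - R @ x) / D         # every component uses only x^(k)
        if np.linalg.norm(x_new - x) <= tol * np.linalg.norm(x_new):
            return x_new, k + 1
        x = x_new
    return x, max_iter
```

Because `x_new` depends only on the previous iterate, the elementwise expression updates all components at once.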

Remark 34.2.1 (Parallelism). All components can be updated simultaneously in the Jacobi method, making it well-suited for parallel computation.

Gauss-Seidel Method

The Gauss-Seidel method improves upon Jacobi by using the most recently updated values as soon as they are available. This means each new $x_i^{(k+1)}$ is immediately used in subsequent updates within the same iteration. The update rule is
$$x^{(k+1)} = (D + L)^{-1}\left(b - U\,x^{(k)}\right),$$
or, component-wise,
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j < i} a_{ij}\, x_j^{(k+1)} - \sum_{j > i} a_{ij}\, x_j^{(k)}\right), \qquad i = 1, \dots, n.$$
A backward Gauss-Seidel iteration can also be defined, where the updates sweep from $i = n$ down to $i = 1$ in each iteration:
$$x^{(k+1)} = (D + U)^{-1}\left(b - L\,x^{(k)}\right),$$
or, component-wise,
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j > i} a_{ij}\, x_j^{(k+1)} - \sum_{j < i} a_{ij}\, x_j^{(k)}\right), \qquad i = n, \dots, 1.$$
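
A minimal sketch of the Gauss-Seidel sweep follows; the backward variant differs only in the order in which the indices are visited. The function name, `backward` flag, and stopping parameters are illustrative.

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-8, max_iter=1000, backward=False):
    """Sketch of forward (or backward) Gauss-Seidel: new values are reused immediately."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.copy()
    order = range(n - 1, -1, -1) if backward else range(n)
    for k in range(max_iter):
        x_old = x.copy()
        for i in order:
            # A[i, :] @ x already contains the freshly updated components of this sweep
            sigma = A[i, :] @ x - A[i, i] * x[i]
            x[i] = (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old) <= tol * np.linalg.norm(x):
            return x, k + 1
    return x, max_iter
```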

Remark 34.2.2 (Sequential Update). The Gauss-Seidel method typically converges faster than Jacobi, as it incorporates the latest information within each iteration. However, the updates are inherently sequential.

Convergence Analysis

All three methods (Jacobi, forward Gauss-Seidel, and backward Gauss-Seidel) can be written in the fixed-point form
$$x^{(k+1)} = G\,x^{(k)} + c,$$
where $G$ is the iteration matrix. Specifically,
$$G_J = -D^{-1}(L + U), \qquad G_{GS} = -(D + L)^{-1} U, \qquad G_{BGS} = -(D + U)^{-1} L$$
for the Jacobi, forward Gauss-Seidel, and backward Gauss-Seidel iterations, respectively. The convergence of these methods depends on the spectral radius $\rho(G)$ (the largest absolute value of the eigenvalues of $G$). The method converges for every initial guess $x^{(0)}$ if and only if $\rho(G) < 1$.
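
For small test problems, the iteration matrices and their spectral radii can be formed explicitly (this requires dense solves, so it is only practical for small $n$). The sketch below follows the splitting notation above; the function name and the example matrix are illustrative.

```python
import numpy as np

def spectral_radii(A):
    """Spectral radii of the Jacobi, forward GS, and backward GS iteration matrices."""
    D = np.diag(np.diag(A))
    L = np.tril(A, k=-1)                 # strictly lower triangular part
    U = np.triu(A, k=1)                  # strictly upper triangular part
    G_J   = -np.linalg.solve(D, L + U)   # -D^{-1} (L + U)
    G_GS  = -np.linalg.solve(D + L, U)   # -(D + L)^{-1} U
    G_BGS = -np.linalg.solve(D + U, L)   # -(D + U)^{-1} L
    rho = lambda G: max(abs(np.linalg.eigvals(G)))
    return rho(G_J), rho(G_GS), rho(G_BGS)

# Example: a strictly diagonally dominant tridiagonal matrix, so all radii are below 1.
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
print(spectral_radii(A))
```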

A sufficient condition for convergence of both the Jacobi and Gauss-Seidel methods is that $A$ is strictly diagonally dominant (i.e., $|a_{ii}| > \sum_{j \neq i} |a_{ij}|$ for all $i$). For the Gauss-Seidel method, convergence is also guaranteed if $A$ is symmetric positive definite (SPD).
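
Strict diagonal dominance is straightforward to check row by row; the helper below is an illustrative sketch of that test.

```python
import numpy as np

def is_strictly_diagonally_dominant(A):
    """True if |a_ii| > sum_{j != i} |a_ij| holds for every row i."""
    diag = np.abs(np.diag(A))
    off_diag = np.sum(np.abs(A), axis=1) - diag
    return bool(np.all(diag > off_diag))
```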

Termination Criteria

In practice, iterative methods are terminated when the solution is deemed sufficiently accurate rather than running indefinitely. Common termination criteria include:

  1. Residual-based: Stop when $\|b - A x^{(k)}\| \leq \epsilon$ or $\|b - A x^{(k)}\| / \|b\| \leq \epsilon$ for a prescribed tolerance $\epsilon$.
  2. Solution change: Stop when $\|x^{(k+1)} - x^{(k)}\| \leq \epsilon$ or $\|x^{(k+1)} - x^{(k)}\| / \|x^{(k)}\| \leq \epsilon$, indicating that the iterates have stopped changing appreciably.
  3. Maximum iterations: Stop after a predetermined number of iterations to prevent infinite loops.

The relative residual-based criterion is most commonly used, as it directly measures how well the current iterate satisfies the original linear system while accounting for the scale of the problem.
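
Putting the three criteria together, a typical driver loop wraps a single-sweep update and stops on whichever test fires first. The sketch below is a generic, illustrative pattern (the `sweep` callback and function names are assumptions, not a standard API); the `jacobi_sweep` at the end shows how one Jacobi step fits into it.

```python
import numpy as np

def solve_iteratively(A, b, sweep, x0=None, tol=1e-8, max_iter=1000):
    """Generic driver: `sweep(A, b, x)` performs one iteration and returns the new iterate."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.copy()
    b_norm = np.linalg.norm(b)
    for k in range(max_iter):
        x_new = sweep(A, b, x)
        # 1. relative residual: how well x_new satisfies A x = b
        if np.linalg.norm(b - A @ x_new) <= tol * b_norm:
            return x_new, k + 1
        # 2. relative change between successive iterates
        if np.linalg.norm(x_new - x) <= tol * np.linalg.norm(x_new):
            return x_new, k + 1
        x = x_new
    # 3. iteration cap reached without meeting either tolerance
    return x, max_iter

# One Jacobi sweep, written as a callback compatible with the driver above.
jacobi_sweep = lambda A, b, x: (b - (A - np.diagflat(np.diag(A))) @ x) / np.diag(A)
```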