The simplest method is the Jacobi iteration.

  • We can check that it follows the general framework of splitting methods, with $M = D$, the diagonal of $A$ (see the sketch after this list).
  • Very easy to implement
  • Each iteration is cheap: essentially one sweep over the entries of $A$
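
As a brief sketch of how Jacobi fits the splitting framework (standard notation assumed here: $A = D + L + U$, with $D$ the diagonal, $L$ the strictly lower and $U$ the strictly upper triangular part of $A$):

$$
M = D, \qquad N = M - A = -(L + U),
$$

so one step of the splitting iteration $M x^{k+1} = N x^k + b$ reads

$$
x^{k+1} = D^{-1} \bigl( b - (L + U) x^k \bigr).
$$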

Code

function jacobi(A, b, x, maxiter)
    n = length(b)
    for k = 1:maxiter
        xnew = zeros(n)
        for i = 1:n
            # accumulate the off-diagonal part of row i applied to the old iterate
            for j = 1:n
                if j != i
                    xnew[i] += A[i,j] * x[j]
                end
            end
            # solve the i-th equation for the i-th unknown
            xnew[i] = (b[i] - xnew[i]) / A[i,i]
        end
        x = xnew
    end
    return x
end
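
A quick usage sketch, with an illustrative strictly row diagonally dominant test problem:

A = [4.0 1.0 1.0;
     1.0 5.0 2.0;
     0.0 2.0 6.0]            # strictly row diagonally dominant
b = [6.0, 8.0, 8.0]
x = jacobi(A, b, zeros(3), 50)
println(A * x - b)           # residual, close to zero after 50 sweeps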

Convergence of Jacobi

If $A$ is strictly row diagonally dominant,

$$
|a_{ii}| > \sum_{j \ne i} |a_{ij}| \quad \text{for all } i,
$$

then Jacobi converges. We have a similar result if $A$ is strictly column diagonally dominant, i.e., $|a_{jj}| > \sum_{i \ne j} |a_{ij}|$ for all $j$.
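
As a small illustrative check, strict row diagonal dominance is easy to test directly (isrowdominant is a hypothetical helper name):

# true if |A[i,i]| > sum_{j != i} |A[i,j]| for every row i (hypothetical helper)
isrowdominant(A) = all(abs(A[i,i]) > sum(abs, A[i,:]) - abs(A[i,i]) for i = 1:size(A,1))

isrowdominant([4.0 1.0 1.0; 1.0 5.0 2.0; 0.0 2.0 6.0])   # returns true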

  • Jacobi is very easy to implement on parallel computers because $M = D$ is diagonal (or block diagonal).
  • The solution step is parallel, i.e., the entries of $x^{k+1}$ can be computed concurrently using independent computing cores (see the threaded sketch below):

$$
x_i^{k+1} = \frac{1}{a_{ii}} \Bigl( b_i - \sum_{j \ne i} a_{ij} x_j^k \Bigr), \qquad i = 1, \dots, n.
$$
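
A minimal threaded sketch of this update in Julia (Threads.@threads is standard Julia; the function name and interface are illustrative):

function jacobi_step_threaded(A, b, x)
    n = length(b)
    xnew = zeros(n)
    # every entry of xnew depends only on the old iterate x,
    # so the rows can be processed on independent threads
    Threads.@threads for i = 1:n
        s = 0.0
        for j = 1:n
            if j != i
                s += A[i,j] * x[j]
            end
        end
        xnew[i] = (b[i] - s) / A[i,i]
    end
    return xnew
end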

Proof

The iteration matrix in Jacobi is

$$
T = M^{-1} N = D^{-1} (D - A) = I - D^{-1} A = -D^{-1} (L + U).
$$

Assume that $A$ is strictly row diagonally dominant. Then, from the definition of the operator norm, we find that

$$
\|T\|_\infty = \max_i \sum_{j \ne i} \frac{|a_{ij}|}{|a_{ii}|} < 1.
$$

This proves the convergence of Jacobi since

$$
\|x^k - x\|_\infty = \|T^k (x^0 - x)\|_\infty \le \|T\|_\infty^k \, \|x^0 - x\|_\infty \longrightarrow 0.
$$

Assume now that $A$ is strictly column diagonally dominant. Then:

$$
\|(L + U) D^{-1}\|_1 = \max_j \sum_{i \ne j} \frac{|a_{ij}|}{|a_{jj}|} < 1.
$$

But

$$
D T D^{-1} = -(L + U) D^{-1},
$$

so $T$ and $-(L + U) D^{-1}$ are similar and share the same eigenvalues. So

$$
\rho(T) = \rho\bigl( (L + U) D^{-1} \bigr) \le \|(L + U) D^{-1}\|_1 < 1,
$$

and

$$
x^k - x = T^k (x^0 - x) \longrightarrow 0 \quad \text{for any } x^0,
$$

since $\rho(T) < 1$ implies $T^k \to 0$.
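
The quantities in this proof are easy to inspect numerically; a short illustrative check (same assumed test matrix as above):

using LinearAlgebra

A = [4.0 1.0 1.0;
     1.0 5.0 2.0;
     0.0 2.0 6.0]
D = Diagonal(diag(A))
T = I - D \ A                       # Jacobi iteration matrix
println(opnorm(T, Inf))             # maximum absolute row sum, < 1 by row dominance
println(maximum(abs.(eigvals(T))))  # spectral radius, bounded by the norm above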

There is also a proof that uses the Gershgorin Circle Theorem.

Gershgorin Circle Theorem. For any complex matrix $A$, each eigenvalue of $A$ lies within at least one of the Gershgorin discs in the complex plane. The $i$-th Gershgorin disc is centered at the diagonal entry $a_{ii}$ with radius $R_i = \sum_{j \ne i} |a_{ij}|$. Mathematically, this is expressed as:

$$
|\lambda - a_{ii}| \le R_i = \sum_{j \ne i} |a_{ij}| \quad \text{for at least one index } i,
$$

where $\lambda$ represents an eigenvalue of $A$.

Convergence proof using Gershgorin’s Theorem. $T = I - D^{-1} A = -D^{-1}(L + U)$ is the iteration matrix in Jacobi. For the Jacobi method to converge, the spectral radius $\rho(T)$ must be less than 1.

Applying Gershgorin’s Theorem. The diagonal elements of $T$ are zero, and the off-diagonal elements are $t_{ij} = -a_{ij} / a_{ii}$. The sum of the absolute values of the off-diagonal elements in the $i$-th row is:

$$
R_i = \sum_{j \ne i} \frac{|a_{ij}|}{|a_{ii}|}.
$$

If $A$ is strictly row diagonally dominant, then:

$$
\sum_{j \ne i} |a_{ij}| < |a_{ii}| \quad \text{for all } i.
$$

This implies:

$$
R_i = \sum_{j \ne i} \frac{|a_{ij}|}{|a_{ii}|} < 1 \quad \text{for all } i.
$$

Therefore, each Gershgorin disc for $T$ is centered at zero with a radius less than 1. Consequently, all eigenvalues of $T$ lie strictly inside the unit circle.

Since all eigenvalues of $T$ have magnitude less than 1, the spectral radius satisfies $\rho(T) < 1$. This ensures that the Jacobi iteration converges for any initial guess $x^0$.

The same argument applies to strictly column diagonally dominant matrices, by working with the similar matrix $D T D^{-1} = -(L + U) D^{-1}$ (as in the proof above) and using column sums in Gershgorin’s theorem.
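
An illustrative check of the Gershgorin bound on $T$ (same assumed test matrix as before):

using LinearAlgebra

A = [4.0 1.0 1.0;
     1.0 5.0 2.0;
     0.0 2.0 6.0]
T = I - Diagonal(diag(A)) \ A              # Jacobi iteration matrix, zero diagonal
radii = [sum(abs, T[i, :]) for i = 1:3]    # Gershgorin radii (discs centered at 0)
println(radii)                             # each radius is < 1 by row dominance
println(abs.(eigvals(T)))                  # eigenvalue moduli, all less than 1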