The previous methods (Jacobi, Gauss-Seidel) lack any parameter that can be adjusted to accelerate convergence in difficult cases.
Let us revisit Gauss-Seidel:

$$x_i^{(k+1)} = \frac{1}{a_{ii}} \Big( b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j>i} a_{ij} x_j^{(k)} \Big)$$

Let us rewrite this as

$$x_i^{(k+1)} = x_i^{(k)} + \Delta x_i, \qquad \Delta x_i = \frac{1}{a_{ii}} \Big( b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j \ge i} a_{ij} x_j^{(k)} \Big)$$

Then SOR is defined as

$$x_i^{(k+1)} = x_i^{(k)} + \omega \, \Delta x_i$$

for some parameter $\omega$.
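Written out (a standard equivalent form, not shown explicitly above), the SOR step is a weighted combination of the old iterate and the Gauss-Seidel value:

$$x_i^{(k+1)} = (1 - \omega)\, x_i^{(k)} + \frac{\omega}{a_{ii}} \Big( b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j>i} a_{ij} x_j^{(k)} \Big)$$

so that $\omega = 1$ recovers plain Gauss-Seidel.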
SOR: Successive Over-Relaxation
Gauss-Seidel update:

$$x_i^{(k+1)} = \frac{1}{a_{ii}} \Big( b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j>i} a_{ij} x_j^{(k)} \Big)$$

Here is the increment form of this update:

$$x_i^{(k+1)} = x_i^{(k)} + \Delta x_i^{(k)}, \qquad \Delta x_i^{(k)} = \frac{r_i^{(k)}}{a_{ii}}, \qquad r_i^{(k)} = b_i - \sum_{j<i} a_{ij} x_j^{(k+1)} - \sum_{j \ge i} a_{ij} x_j^{(k)}$$

We can now define the SOR update:

$$x_i^{(k+1)} = x_i^{(k)} + \omega \, \Delta x_i^{(k)}$$
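In matrix form (an equivalent statement, assuming the usual splitting $A = D + L + U$ into diagonal, strictly lower, and strictly upper parts), one SOR iteration reads:

$$(D + \omega L)\, x^{(k+1)} = \big[(1 - \omega) D - \omega U\big]\, x^{(k)} + \omega\, b$$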
SOR relaxation parameter
$\omega$ can be used for different purposes (a note on its admissible range follows this list):
- If the method is diverging, we can apply a smaller increment: $\omega < 1$.
- $\omega > 1$: when possible, we can try to accelerate convergence by applying a larger increment.
- This is somewhat similar to gradient descent with momentum, if you are familiar with that optimization algorithm.
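As a point of reference (a standard result, not stated above): the SOR iteration matrix has spectral radius at least $|\omega - 1|$, so the method can converge only for

$$0 < \omega < 2$$

with $\omega < 1$ called under-relaxation and $\omega > 1$ over-relaxation.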
Code for SOR
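The code listing itself did not survive in this version of the notes; the following is a minimal NumPy sketch of SOR with the natural 1-to-$n$ ordering (the function name `sor` and the parameters `omega`, `tol`, and `max_iter` are chosen here for illustration):

```python
import numpy as np

def sor(A, b, omega=1.5, x0=None, tol=1e-10, max_iter=10_000):
    """Solve A x = b with Successive Over-Relaxation (natural 1..n ordering)."""
    n = len(b)
    x = np.zeros_like(b, dtype=float) if x0 is None else np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Gauss-Seidel increment: row residual divided by the diagonal entry,
            # using already-updated values x[:i] and old values x[i+1:].
            sigma = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            dx = (b[i] - sigma - A[i, i] * x[i]) / A[i, i]
            # Relaxed update: omega = 1 recovers plain Gauss-Seidel.
            x[i] += omega * dx
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x
```

With `omega=1.0` this reduces to the Gauss-Seidel iteration from the previous section; a quick sanity check is to compare `sor(A, b)` against `np.linalg.solve(A, b)` on a small diagonally dominant system.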
Ordering
As in Gauss-Seidel, the ordering matters in this algorithm. Here are three simple orderings that are commonly used (see the sketch after the list):
- 1 to $n$
- 1 to $n$ followed by $n$ to 1, as in Symmetric SOR
- Or, a different randomized permutation at every iteration
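To make the role of the ordering concrete, here is a hypothetical sketch (not from the original notes) of a single SOR sweep that takes the visiting order as an argument; the three options above then differ only in the index sequence that is passed in:

```python
import numpy as np

def sor_sweep(A, b, x, omega, order):
    """Perform one in-place SOR sweep, visiting the unknowns in the given order."""
    for i in order:
        # Row residual using the most recent values of all other unknowns.
        sigma = A[i] @ x - A[i, i] * x[i]
        x[i] += omega * (b[i] - sigma - A[i, i] * x[i]) / A[i, i]
    return x

# Example index sequences for an n-by-n system (n chosen here only for illustration):
n = 4
forward = list(range(n))                    # 1 to n
symmetric = forward + forward[::-1]         # 1 to n, then n to 1 (Symmetric SOR)
random_order = np.random.permutation(n)     # new random permutation at every iteration
```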