Goal: compute a QR factorization A = QR, where Q has orthonormal columns and R is upper triangular.

Summary of the different approaches:

  1. Householder/Givens: Find an orthogonal Q such that Q^T A = R is upper triangular. Orthogonal triangularization.
  2. Gram-Schmidt: Find an upper triangular R such that A R^{-1} = Q has orthonormal columns. This is preferred when A is tall and thin. This is a triangular orthogonalization process.

Regular (full) QR factorization, in which Q is square:

Full QR factorization A = QR of a thin m x n matrix A (m >= n). The matrix Q is m x m, square and orthogonal. R is m x n, thin like A, with 0 below the diagonal. This factorization is typically obtained using Householder or Givens transformations.

Thin (reduced) QR factorization A = QR of a thin matrix. Q is m x n, thin and of the same size as A, with orthonormal columns. R is n x n, upper triangular and square. This is the factorization that can be obtained using Gram-Schmidt.
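To see the two shapes side by side, NumPy's `numpy.linalg.qr` computes both: `mode='complete'` gives the full factorization (square Q) and `mode='reduced'` gives the thin one. A small sketch (NumPy is an illustration here, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))  # tall and thin: m = 6, n = 3

# Full QR: Q is 6x6 orthogonal, R is 6x3 with zeros below the diagonal
Q_full, R_full = np.linalg.qr(A, mode='complete')

# Thin QR: Q is 6x3 (same size as A), R is 3x3 upper triangular
Q_thin, R_thin = np.linalg.qr(A, mode='reduced')

print(Q_full.shape, R_full.shape)   # (6, 6) (6, 3)
print(Q_thin.shape, R_thin.shape)   # (6, 3) (3, 3)
print(np.allclose(A, Q_thin @ R_thin))  # True
```

Both calls reproduce A; the thin form simply drops the last m - n columns of Q and the corresponding zero rows of R.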

The computational strategy is similar to LU and Cholesky.

We start with the product in outer-product form: A = QR = sum_{k=1}^{n} q_k r_k^T, where q_k is column k of Q and r_k^T is row k of R.

The first column is simple. Since R is upper triangular, a_1 = r_11 q_1, so q_1 is just a_1 scaled to have norm 1. Define q_1 = a_1 / ||a_1|| and r_11 = ||a_1||.

How do we get the remaining entries r_1j of the first row of R?

Consider column j of A: a_j = sum_{k=1}^{j} r_kj q_k.

Use the fact that the q_k are orthonormal: q_1^T a_j = r_1j.

We are projecting a_j onto q_1.

Then subtract: a_j <- a_j - r_1j q_1, i.e., form A - q_1 r_1^T.

The first column of A - q_1 r_1^T is now zero, and the remaining columns are orthogonal to q_1. Repeat on the remaining columns to get all the other columns of Q and rows of R.
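The first elimination step can be sketched directly (a NumPy illustration of the step above, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

# Normalize the first column: r_11 = ||a_1||, q_1 = a_1 / r_11
r11 = np.linalg.norm(A[:, 0])
q1 = A[:, 0] / r11

# First row of R: r_1j = q_1^T a_j (projection of each column onto q_1)
r1 = q1 @ A          # length-n row; r1[0] equals r11

# Subtract the rank-1 piece q_1 r_1^T; the first column becomes zero
A1 = A - np.outer(q1, r1)

print(np.allclose(A1[:, 0], 0))  # True
```

The remaining columns of A1 are orthogonal to q_1, so the same step can be repeated on them.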

Summary: A is m x n.

Loop: k from 1 to n

  • r_kk = ||a_k||, q_k = a_k / r_kk.
  • For j = k+1, ..., n: r_kj = q_k^T a_j, then a_j <- a_j - r_kj q_k.

Each iteration requires about 4m(n - k) flops. So the total computational cost is about 2mn^2 flops.
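The loop above transcribes directly into code (a NumPy sketch; the in-place subtraction means each r_kj is computed against the already-updated columns):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Thin QR of an m x n matrix A (m >= n, full column rank),
    following the outer-product Gram-Schmidt loop above."""
    A = np.array(A, dtype=float)   # work on a copy; columns are updated in place
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        R[k, k] = np.linalg.norm(A[:, k])   # r_kk = ||a_k||
        Q[:, k] = A[:, k] / R[k, k]         # q_k = a_k / r_kk
        R[k, k+1:] = Q[:, k] @ A[:, k+1:]   # r_kj = q_k^T a_j for j > k
        A[:, k+1:] -= np.outer(Q[:, k], R[k, k+1:])  # subtract projection
    return Q, R

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 4))
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))            # True
print(np.allclose(Q.T @ Q, np.eye(4)))  # True
```

Q comes out with orthonormal columns and R upper triangular, so Q R reproduces A.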

If the matrix is square, n x n, the cost is 2n^3 = O(n^3) flops.

Related topics: QR factorization; QR using Householder transformations; QR using Givens transformations.