For any square matrix $A \in \mathbb{C}^{n \times n}$, there exists at least one scalar $\lambda \in \mathbb{C}$ and one nonzero vector $x \in \mathbb{C}^n$ such that $Ax = \lambda x$. They are called an eigenvalue and an eigenvector of $A$.
The computational cost of computing all the eigenvalues of a dense $n \times n$ matrix is $O(n^3)$. This will be covered in greater detail in future chapters.
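As a quick illustration of the definition, here is a minimal NumPy sketch (the $2 \times 2$ matrix is an arbitrary example, not taken from the text): it computes eigenvalue/eigenvector pairs with `np.linalg.eig` and checks the defining relation $Ax = \lambda x$.

```python
import numpy as np

# An arbitrary 2x2 example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and, as columns, the eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each pair satisfies the defining relation A x = lambda x (up to rounding).
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)
```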
Existence of eigenvalues
We prove the existence of at least one eigenvalue $\lambda$ and one eigenvector $x$ with $x \neq 0$.
Consider the sequence $u, Au, A^2u, \ldots, A^n u$ for some nonzero vector $u$. We have $n+1$ vectors in $\mathbb{C}^n$, so they are linearly dependent. So there exist scalars $c_0, \ldots, c_n$, not all zero, such that
$$c_0 u + c_1 Au + \cdots + c_n A^n u = 0.$$
Not all of $c_1, \ldots, c_n$ can vanish (otherwise $c_0 u = 0$ with $u \neq 0$ would force $c_0 = 0$), so the polynomial $c_0 + c_1 z + \cdots + c_n z^n$ is non-constant. Using the fundamental theorem of algebra we can find $\mu \neq 0$ and $\lambda_1, \ldots, \lambda_m$, with $m \geq 1$, such that
$$c_0 + c_1 z + \cdots + c_n z^n = \mu (z - \lambda_1) \cdots (z - \lambda_m).$$
This shows that
$$\mu (A - \lambda_1 I)(A - \lambda_2 I) \cdots (A - \lambda_m I)\, u = 0.$$
If $A - \lambda_i I$ is non-singular for all $i$, then the product
$$\mu (A - \lambda_1 I) \cdots (A - \lambda_m I)$$
is non-singular. This is a contradiction, since a non-singular matrix cannot send the nonzero vector $u$ to $0$. So there exists an $i$ such that $A - \lambda_i I$ is singular. So $\lambda_i$ is an eigenvalue.
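The argument is essentially constructive and can be mimicked numerically. Below is a minimal NumPy sketch (the matrix, the starting vector, and the tolerances are arbitrary choices): it builds the dependent sequence $u, Au, \ldots, A^n u$, extracts coefficients $c_0, \ldots, c_n$ from its null space, factors the polynomial with `np.roots`, and checks that one of the roots makes $A - \lambda I$ singular.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
u = rng.standard_normal(n)

# Columns u, Au, ..., A^n u: n+1 vectors in an n-dimensional space,
# hence linearly dependent.
K = np.empty((n, n + 1))
K[:, 0] = u
for j in range(1, n + 1):
    K[:, j] = A @ K[:, j - 1]

# A null vector c of K gives c_0 u + c_1 Au + ... + c_n A^n u = 0.
_, _, Vh = np.linalg.svd(K)
c = Vh[-1]  # right singular vector for the (numerically) zero singular value

# Roots of c_0 + c_1 z + ... + c_n z^n (np.roots expects highest degree first).
roots = np.roots(c[::-1])

# At least one root lambda_i must make A - lambda_i I singular,
# i.e. be an eigenvalue of A.
smallest_sv = [np.linalg.svd(A - lam * np.eye(n), compute_uv=False)[-1]
               for lam in roots]
best = int(np.argmin(smallest_sv))
print(roots[best], smallest_sv[best])  # smallest singular value ~ 0
```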
Finding all the eigenvalues
We show that there exists a non-singular matrix $X$ and an upper triangular matrix $T$ such that
$$A = X T X^{-1}.$$
We will show that all the eigenvalues of $A$ can be found along the diagonal of $T$.
Let's prove the existence of $X$ and $T$.
We use a proof by induction on the matrix size $n$. The result can be verified when $n = 1$: a $1 \times 1$ matrix is already upper triangular, so we can take $X = 1$ and $T = A$.
Assume that the result is true for matrices of size less than $n$.
We know that there exist $\lambda$ and $x \neq 0$ such that $Ax = \lambda x$. Denote by $E$ the range of $A - \lambda I$:
$$E = \{ (A - \lambda I) z : z \in \mathbb{C}^n \}.$$
Since $A - \lambda I$ is singular, $\dim E \leq n - 1$.
Moreover, $E$ is stable under $A$, that is, $A E \subseteq E$. To prove this, take $y = (A - \lambda I) z \in E$. Then:
$$A y = A (A - \lambda I) z = (A - \lambda I)(A z).$$
Since $(A - \lambda I)(Az) \in E$, we have $Ay \in E$. So $E$ is stable under $A$.
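This invariance is easy to check numerically. A small NumPy sketch (the matrix is arbitrary, `np.linalg.eigvals` is used only as a shortcut to obtain some eigenvalue $\lambda$, and the rank tolerance is an arbitrary choice): build an orthonormal basis of $E = \operatorname{range}(A - \lambda I)$ and verify that $A$ maps it back into the same subspace.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))

lam = np.linalg.eigvals(A)[0]          # any eigenvalue of A
M = A - lam * np.eye(n)                # singular by construction

# Orthonormal basis of E = range(A - lam*I): the left singular vectors
# associated with the nonzero singular values of M.
U, s, _ = np.linalg.svd(M)
m = int(np.sum(s > 1e-10 * s[0]))      # numerical rank, at most n - 1
Q = U[:, :m]

# Stability of E under A: every column of A @ Q must stay in span(Q),
# i.e. its component orthogonal to Q must vanish.
residual = (np.eye(n) - Q @ Q.conj().T) @ (A @ Q)
print(np.linalg.norm(residual))        # ~ 0
```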
We can consider the restriction of $A$ to the subspace $E$. Using the induction hypothesis, there is a basis $x_1, \ldots, x_m$ of $E$, with $m = \dim E \leq n - 1$, such that $A$ restricted to $E$ is upper triangular in that basis.
Let's extend this set of independent vectors to get a full basis of $\mathbb{C}^n$:
$$x_1, \ldots, x_m, x_{m+1}, \ldots, x_n.$$
Take a $j > m$ and write
$$A x_j = (A - \lambda I) x_j + \lambda x_j.$$
The first term $(A - \lambda I) x_j$ belongs to $E = \operatorname{span}(x_1, \ldots, x_m)$. So $A x_j \in \operatorname{span}(x_1, \ldots, x_m, x_j)$. This proves that
$$X^{-1} A X = T = \begin{pmatrix} T_1 & B \\ 0 & \lambda I \end{pmatrix},$$
where $X = [x_1, \ldots, x_n]$ and $T_1$, the matrix of the restriction of $A$ to $E$ in the basis $x_1, \ldots, x_m$, is upper triangular, so that $T$ is upper triangular. So $A = X T X^{-1}$.
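To make the induction step concrete, here is a small NumPy sketch (the matrix is arbitrary, `np.linalg.eigvals` is used only as a shortcut to obtain $\lambda$, and completing the basis of $E$ with the remaining left singular vectors is one convenient choice among many): it checks that in such a basis the lower-left block of $X^{-1} A X$ vanishes and the lower-right block equals $\lambda I$. Turning the upper-left block into a triangular matrix is exactly what the induction hypothesis provides.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))

lam = np.linalg.eigvals(A)[0]
U, s, _ = np.linalg.svd(A - lam * np.eye(n))
m = int(np.sum(s > 1e-10 * s[0]))      # dim E, at most n - 1

# The first m columns of U span E; the remaining columns complete them
# to a basis of C^n, so X = U is one admissible change of basis.
X = U
T = np.linalg.solve(X, A @ X)          # T = X^{-1} A X

print(np.linalg.norm(T[m:, :m]))                          # lower-left block ~ 0
print(np.linalg.norm(T[m:, m:] - lam * np.eye(n - m)))    # lower-right ~ lam * I
```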
The eigenvalues of $A$ are equal to the eigenvalues of $T$, since the two matrices are similar. Moreover, we can prove that the upper triangular matrix $T - \mu I$ is singular if and only if this matrix has a $0$ on the diagonal, that is, if and only if $\mu$ equals one of the diagonal entries of $T$. This implies that all the eigenvalues of $A$ are on the diagonal of $T$, and there are $n$ of them. Note that these eigenvalues might be repeated.
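In floating-point practice this factorization is usually computed as a Schur decomposition $A = Q T Q^*$ with $Q$ unitary, a special case of the statement above. A minimal sketch using `scipy.linalg.schur` (the test matrix is arbitrary) checks that $T$ is upper triangular and that its diagonal reproduces the eigenvalues of $A$.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 6))

# Complex Schur form: A = Z T Z^*, with Z unitary and T upper triangular.
T, Z = schur(A, output='complex')

assert np.allclose(Z @ T @ Z.conj().T, A)
assert np.allclose(np.tril(T, k=-1), 0)

# The diagonal of T carries all n eigenvalues of A (repetitions included).
print(np.sort_complex(np.diag(T)))
print(np.sort_complex(np.linalg.eigvals(A)))
```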