The matrix decomposition of a square matrix into so-called eigenvalues and eigenvectors is an extremely important one. This decomposition generally goes under the name "matrix diagonalization." However, this moniker is less than optimal, since the process being described is really the decomposition of a matrix into a product of three other matrices, only one of which is diagonal, and also because all other standard types of matrix decomposition use the term "decomposition" in their names, e.g., Cholesky decomposition, Hessenberg decomposition, and so on. As a result, the decomposition of a matrix into matrices composed of its eigenvectors and eigenvalues is called eigen decomposition in this work.
Assume $A$ has nondegenerate eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$ and corresponding linearly independent eigenvectors $X_1, X_2, \ldots, X_k$, which can be denoted

$$X_1 = \begin{bmatrix} x_{11} \\ x_{21} \\ \vdots \\ x_{n1} \end{bmatrix}, \quad X_2 = \begin{bmatrix} x_{12} \\ x_{22} \\ \vdots \\ x_{n2} \end{bmatrix}, \quad \ldots, \quad X_k = \begin{bmatrix} x_{1k} \\ x_{2k} \\ \vdots \\ x_{nk} \end{bmatrix}. \tag{1}$$
Define the matrices composed of eigenvectors

$$\begin{align}
P &= \begin{bmatrix} X_1 & X_2 & \cdots & X_k \end{bmatrix} \tag{2}\\
&= \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1k} \\ x_{21} & x_{22} & \cdots & x_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{nk} \end{bmatrix} \tag{3}
\end{align}$$
and eigenvalues

$$D = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_k \end{bmatrix}, \tag{4}$$
where $D$ is a diagonal matrix. Then
$$\begin{align}
AP &= A \begin{bmatrix} X_1 & X_2 & \cdots & X_k \end{bmatrix} \tag{5}\\
&= \begin{bmatrix} A X_1 & A X_2 & \cdots & A X_k \end{bmatrix} \tag{6}\\
&= \begin{bmatrix} \lambda_1 X_1 & \lambda_2 X_2 & \cdots & \lambda_k X_k \end{bmatrix} \tag{7}\\
&= \begin{bmatrix} \lambda_1 x_{11} & \lambda_2 x_{12} & \cdots & \lambda_k x_{1k} \\ \lambda_1 x_{21} & \lambda_2 x_{22} & \cdots & \lambda_k x_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ \lambda_1 x_{n1} & \lambda_2 x_{n2} & \cdots & \lambda_k x_{nk} \end{bmatrix} \tag{8}\\
&= \begin{bmatrix} X_1 & X_2 & \cdots & X_k \end{bmatrix} D \tag{9}\\
&= P D, \tag{10}
\end{align}$$
giving the amazing decomposition of $A$ into a similarity transformation involving $P$ and $D$,

$$A = P D P^{-1}. \tag{11}$$
The fact that this decomposition is always possible for a square matrix $A$ as long as $P$ is a square matrix (the linear independence of the eigenvectors then guarantees that $P$ is invertible) is known in this work as the eigen decomposition theorem.
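As a concrete numerical illustration, the following minimal sketch checks equations (5)-(10) and (11) with NumPy; the matrix $A$ here is an arbitrary symmetric example (chosen so that its eigenvalues are real and nondegenerate), not one taken from the text above.

```python
import numpy as np

# An arbitrary symmetric example matrix with nondegenerate eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues (the diagonal of D) and a matrix
# whose columns are the corresponding eigenvectors (the matrix P).
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# AP = PD, equations (5)-(10).
assert np.allclose(A @ P, P @ D)

# A = P D P^(-1), equation (11).
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```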
Furthermore, squaring both sides of equation (11) gives
$$\begin{align}
A^2 &= (P D P^{-1})(P D P^{-1}) \tag{12}\\
&= P D (P^{-1} P) D P^{-1} \tag{13}\\
&= P D^2 P^{-1}. \tag{14}
\end{align}$$
By induction, it follows that for general positive integer powers,
$$A^n = P D^n P^{-1}. \tag{15}$$
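Since $D^n$ is just the diagonal matrix of $n$th powers of the eigenvalues, equation (15) reduces a matrix power to two matrix products. A short NumPy check, again using an arbitrary example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)
P_inv = np.linalg.inv(P)

# D^n is the elementwise n-th power of the eigenvalues on the diagonal,
# so A^n = P D^n P^(-1), equation (15).
for n in range(1, 6):
    A_n = P @ np.diag(eigenvalues ** n) @ P_inv
    assert np.allclose(A_n, np.linalg.matrix_power(A, n))
```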
The inverse of $A$ is
$$\begin{align}
A^{-1} &= (P D P^{-1})^{-1} \tag{16}\\
&= P D^{-1} P^{-1}, \tag{17}
\end{align}$$
where the inverse of the diagonal matrix $D$ is trivially given by
$$D^{-1} = \begin{bmatrix} \lambda_1^{-1} & 0 & \cdots & 0 \\ 0 & \lambda_2^{-1} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_k^{-1} \end{bmatrix}. \tag{18}$$
Equation (15) therefore holds for negative as well as positive $n$.
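The same kind of numerical sketch verifies equations (17) and (18); as before, the example matrix is an arbitrary choice (its eigenvalues must all be nonzero for the inverse to exist):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)

# D^(-1) is diagonal with entries 1/lambda_i (equation (18)), so
# A^(-1) = P D^(-1) P^(-1) (equation (17)).
A_inv = P @ np.diag(1.0 / eigenvalues) @ np.linalg.inv(P)
assert np.allclose(A_inv, np.linalg.inv(A))
```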
A further remarkable result involving the matrices $P$ and $D$ follows from the definition of the matrix exponential,
$$\begin{align}
e^{A} &\equiv I + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \cdots \tag{19}\\
&= P P^{-1} + P D P^{-1} + \frac{P D^2 P^{-1}}{2!} + \frac{P D^3 P^{-1}}{3!} + \cdots \tag{20}\\
&= P \left( I + D + \frac{D^2}{2!} + \frac{D^3}{3!} + \cdots \right) P^{-1} \tag{21}\\
&= P e^{D} P^{-1}. \tag{22}
\end{align}$$
This is true since $D$ is a diagonal matrix and
$$\begin{align}
e^{D} &= \sum_{m=0}^{\infty} \frac{D^m}{m!} \tag{23}\\
&= I + D + \frac{D^2}{2!} + \frac{D^3}{3!} + \cdots \tag{24}\\
&= \begin{bmatrix} \sum_{m=0}^{\infty} \frac{\lambda_1^m}{m!} & 0 & \cdots & 0 \\ 0 & \sum_{m=0}^{\infty} \frac{\lambda_2^m}{m!} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sum_{m=0}^{\infty} \frac{\lambda_k^m}{m!} \end{bmatrix} \tag{25}\\
&= \begin{bmatrix} e^{\lambda_1} & 0 & \cdots & 0 \\ 0 & e^{\lambda_2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & e^{\lambda_k} \end{bmatrix}, \tag{26}
\end{align}$$
so $e^{A}$ can be found using $e^{D}$.
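This too is easy to check numerically. The sketch below compares $P e^{D} P^{-1}$ against SciPy's general-purpose matrix exponential `scipy.linalg.expm`; the use of SciPy, like the example matrix, is an assumption made purely for illustration:

```python
import numpy as np
from scipy.linalg import expm  # general-purpose matrix exponential, for comparison

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)

# e^D is diagonal with entries e^(lambda_i) (equation (26)), so
# e^A = P e^D P^(-1) (equation (22)).
exp_A = P @ np.diag(np.exp(eigenvalues)) @ np.linalg.inv(P)
assert np.allclose(exp_A, expm(A))
```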