An $n\times n$ matrix $A$ is an orthogonal matrix if

$$A A^{\mathrm{T}} = I, \qquad (1)$$

where $A^{\mathrm{T}}$ is the transpose of $A$ and $I$ is the identity matrix.
In particular, an orthogonal matrix is always invertible, and

$$A^{-1} = A^{\mathrm{T}}. \qquad (2)$$
In component form,

$$\left(A^{-1}\right)_{ij} = \left(A^{\mathrm{T}}\right)_{ij} = a_{ji}. \qquad (3)$$
This relation makes orthogonal matrices particularly easy to compute with, since the transpose operation is much simpler than computing an inverse.
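As a rough numerical sketch of this convenience (Python with NumPy; the rotation matrix used here is just an illustrative choice, not one of the examples below), the transpose can stand in for the inverse when solving a linear system:

```python
import numpy as np

# An orthogonal matrix (here, an illustrative 2x2 rotation).
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

b = np.array([1.0, 2.0])

# Solving Q x = b with a general-purpose solver ...
x_solver = np.linalg.solve(Q, b)

# ... versus using the transpose, which is already the inverse of Q.
x_transpose = Q.T @ b

print(np.allclose(x_solver, x_transpose))  # True
```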
For example,

$$A_1 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \qquad (4)$$

and

$$A_2 = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \qquad (5)$$

are orthogonal matrices.
A matrix $m$ can be tested to see if it is orthogonal in the Wolfram Language using OrthogonalMatrixQ[m].
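Outside the Wolfram Language, essentially the same test can be sketched directly from definition (1); the helper name is_orthogonal below is hypothetical, not part of any library:

```python
import numpy as np

def is_orthogonal(m, tol=1e-10):
    """Hypothetical helper: test m m^T = I numerically (cf. equation (1))."""
    m = np.asarray(m, dtype=float)
    if m.ndim != 2 or m.shape[0] != m.shape[1]:
        return False
    return bool(np.allclose(m @ m.T, np.eye(m.shape[0]), atol=tol))

print(is_orthogonal(np.eye(3)))         # True
print(is_orthogonal([[1, 1], [0, 1]]))  # False: a shear is not orthogonal
```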
The rows of an orthogonal matrix form an orthonormal basis: each row has length one, and the rows are mutually perpendicular. Similarly, the columns form an orthonormal basis. In fact, given any orthonormal basis, the matrix whose rows are that basis is an orthogonal matrix, and it is automatically the case that its columns form another orthonormal basis.
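A minimal sketch of this correspondence, assuming NumPy's QR factorization as a convenient source of an orthonormal basis:

```python
import numpy as np

rng = np.random.default_rng(0)

# The QR factorization of a random matrix gives Q with orthonormal columns.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

print(np.allclose(Q.T @ Q, np.eye(4)))  # True: the columns are orthonormal
print(np.allclose(Q @ Q.T, np.eye(4)))  # True: so are the rows, automatically
```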
The orthogonal matrices are precisely those matrices which preserve the inner product:

$$\langle v, w \rangle = \langle A v, A w \rangle. \qquad (6)$$
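A quick numerical check of (6), again using an illustrative rotation matrix:

```python
import numpy as np

theta = 1.1
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # an orthogonal matrix

v = np.array([3.0, -1.0])
w = np.array([0.5, 2.0])

# Applying A to both vectors leaves the inner product unchanged.
print(np.isclose(np.dot(A @ v, A @ w), np.dot(v, w)))  # True
```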
Also, the determinant of $A$ is either $1$ or $-1$. As a subset of $\mathbb{R}^{n\times n}$, the orthogonal matrices are not connected, since the determinant is a continuous function that takes only the two values $1$ and $-1$, and the continuous image of a connected set must be connected. Instead, there are two components, corresponding to whether the determinant is $1$ or $-1$. The orthogonal matrices with $\det A = 1$ are rotations, and such a matrix is called a special orthogonal matrix.
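The two components can be seen numerically; the rotation and reflection below are illustrative choices:

```python
import numpy as np

theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[1.0, 0.0],
                       [0.0, -1.0]])  # reflection across the x-axis

# Both matrices are orthogonal, but they lie in different components.
print(np.isclose(np.linalg.det(rotation), 1.0))     # True: a rotation (special orthogonal)
print(np.isclose(np.linalg.det(reflection), -1.0))  # True: the other component
```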
The matrix product of two orthogonal matrices is another orthogonal matrix. In addition, the inverse of an orthogonal matrix is an orthogonal matrix, as is the identity matrix. Hence the set of orthogonal matrices forms a group, called the orthogonal group $O(n)$.
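A sketch of these group properties, using the hypothetical helper random_orthogonal (built on a QR factorization) to produce test matrices:

```python
import numpy as np

def random_orthogonal(n, seed):
    """Hypothetical helper: an n-by-n orthogonal matrix via QR factorization."""
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

A = random_orthogonal(3, seed=1)
B = random_orthogonal(3, seed=2)
I = np.eye(3)

# Closure, inverses, and the identity: the defining group properties.
print(np.allclose((A @ B) @ (A @ B).T, I))                    # the product is orthogonal
print(np.allclose(np.linalg.inv(A) @ np.linalg.inv(A).T, I))  # the inverse is orthogonal
print(np.allclose(I @ I.T, I))                                # the identity is orthogonal
```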