In linear algebra, an orthogonal matrix is a square matrix G whose transpose is its inverse, i.e.,
- GG^T = G^T G = I_n.
This definition can be given for matrices with entries from any field, but the most common case is that of matrices with real entries, and only that case is considered in the rest of this article.
A real square matrix is orthogonal if and only if its columns form an orthonormal basis of Rn with the ordinary Euclidean dot product, which is the case if and only if its rows form an orthonormal basis of Rn.
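The defining condition can be checked numerically. The following minimal sketch (using a 2-by-2 rotation matrix as an assumed example) verifies that G^T G is the identity, which is equivalent to the columns of G being orthonormal:

```python
import math

def matmul(A, B):
    # multiply two matrices given as lists of rows
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

# 2-by-2 rotation by 30 degrees: a standard orthogonal matrix
t = math.radians(30)
G = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

# G^T G should be the identity (up to floating-point error)
P = matmul(transpose(G), G)
print(all(abs(P[i][j] - (1.0 if i == j else 0.0)) < 1e-12
          for i in range(2) for j in range(2)))  # True
```

The same check with G G^T confirms that the rows are orthonormal as well.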
Geometrically, orthogonal matrices describe linear transformations of Rn which preserve angles and lengths, such as rotations and reflections. They are compatible with the Euclidean inner product in the following sense: if G is orthogonal and x and y are vectors in Rn, then
- <Gx, Gy> = <x, y>.
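This invariance of the inner product can also be illustrated directly. The sketch below (the rotation angle and the vectors x, y are arbitrary choices for the example) compares <Gx, Gy> with <x, y>:

```python
import math

# an assumed example: rotation by 45 degrees
t = math.radians(45)
G = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

def apply(G, v):
    # matrix-vector product Gv
    return [sum(G[i][j] * v[j] for j in range(len(v))) for i in range(len(G))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x, y = [3.0, 1.0], [-2.0, 5.0]
# <Gx, Gy> equals <x, y> up to floating-point error
print(abs(dot(apply(G, x), apply(G, y)) - dot(x, y)) < 1e-12)  # True
```

Since lengths and angles are defined in terms of the dot product, this is exactly the sense in which orthogonal matrices preserve them.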
Conversely, if V is any finite-dimensional real inner product space and f : V → V is a linear map with
- <f(x), f(y)> = <x, y>
for all elements x and y of V, then f is described by an orthogonal matrix with respect to any orthonormal basis of V.
The inverse of every orthogonal matrix is again orthogonal, as is the matrix product of two orthogonal matrices. This shows that the set of all n-by-n orthogonal matrices forms a group. It is a Lie group
of dimension n(n-1)/2 and is called the orthogonal group, denoted by O(n).
The determinant of any orthogonal matrix is 1 or -1. That can be shown as follows:
- 1 = det(I_n) = det(GG^T) = det(G) det(G^T) = (det(G))^2.
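Both signs of the determinant occur. A minimal sketch (the 60-degree rotation and the axis reflection are assumed examples): a rotation has determinant 1, while a reflection has determinant -1.

```python
import math

def det2(M):
    # determinant of a 2-by-2 matrix: ad - bc
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

t = math.radians(60)
rotation = [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]
reflection = [[1.0,  0.0],
              [0.0, -1.0]]   # reflection across the x-axis

print(round(det2(rotation), 12))  # 1.0  (cos^2 t + sin^2 t)
print(det2(reflection))           # -1.0
```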
In three dimensions, the orthogonal matrices with determinant 1 correspond to proper rotations and those with determinant -1 to improper rotations.
The set of all orthogonal matrices whose determinant is 1 is a subgroup of O(n) of index 2, the special orthogonal group SO(n).
All eigenvalues of an orthogonal matrix, even the complex ones, have absolute value 1. Eigenvectors for different eigenvalues are orthogonal.
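For a 2-by-2 rotation matrix the eigenvalues are the complex conjugate pair cos t ± i sin t, which lie on the unit circle. A minimal sketch computing them from the characteristic polynomial (the 30-degree angle is an assumed example):

```python
import cmath
import math

t = math.radians(30)
G = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

# characteristic polynomial of a 2-by-2 matrix: l^2 - tr(G) l + det(G) = 0
tr = G[0][0] + G[1][1]
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
disc = cmath.sqrt(tr * tr - 4 * det)   # negative discriminant -> complex pair
eigs = [(tr + disc) / 2, (tr - disc) / 2]

# every eigenvalue has absolute value 1
print(all(abs(abs(l) - 1.0) < 1e-12 for l in eigs))  # True
```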
If Q is orthogonal, then one can always find an orthogonal matrix P such that
- P^T Q P = diag(R_1, ..., R_k, ±1, ..., ±1),
where the matrices R_1, ..., R_k are 2-by-2 rotation matrices and the remaining diagonal entries are 1 or -1. Intuitively, this result means that every orthogonal matrix describes a combination of rotations and reflections. The matrices R_1, ..., R_k correspond to the pairs of non-real complex-conjugate eigenvalues of Q.
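As a concrete illustration (the 3-by-3 matrix below is an assumed example, already written in this block form), one rotation block together with a -1 entry gives an orthogonal matrix that rotates in a plane and reflects along the remaining axis:

```python
import math

t = math.radians(40)
# block-diagonal orthogonal matrix: a 2-by-2 rotation block plus a -1 entry
Q = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t),  math.cos(t), 0.0],
     [0.0,          0.0,        -1.0]]

# Q^T Q = I confirms orthogonality; the -1 entry makes the determinant -1,
# so Q is an improper rotation
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(3)) for j in range(3)]
       for i in range(3)]
print(all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
          for i in range(3) for j in range(3)))  # True
```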
If A is an arbitrary m-by-n matrix of rank n, we can always write
- A = QR,
where Q is an m-by-n matrix with orthonormal columns (so Q^T Q = I_n) and R is an upper triangular n-by-n matrix with positive main diagonal entries. This is known as a QR decomposition and can be proven by applying the Gram-Schmidt process to the columns of A. It is useful for numerically solving systems of linear equations and least squares problems.
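The Gram-Schmidt construction behind this decomposition can be sketched directly. The function below (a minimal classical Gram-Schmidt, not a numerically robust implementation; the 3-by-2 input is an assumed example) produces Q with orthonormal columns and an upper triangular R with positive diagonal such that A = QR:

```python
def qr_gram_schmidt(A):
    # A is given as a list of n columns, each of length m;
    # returns (Q, R) with Q's columns orthonormal and R upper
    # triangular with positive main diagonal entries
    m, n = len(A[0]), len(A)
    Q, R = [], [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = list(A[j])
        for i in range(len(Q)):
            # R[i][j] is the component of column a_j along q_i
            R[i][j] = sum(Q[i][k] * A[j][k] for k in range(m))
            v = [v[k] - R[i][j] * Q[i][k] for k in range(m)]
        # normalize what is left; nonzero because A has full column rank
        R[j][j] = sum(x * x for x in v) ** 0.5
        Q.append([x / R[j][j] for x in v])
    return Q, R

# a rank-2 example, stored as columns
A = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
Q, R = qr_gram_schmidt(A)

# Q's columns are orthonormal and QR reconstructs A
ortho = all(abs(sum(Q[i][k] * Q[j][k] for k in range(3))
                - (1.0 if i == j else 0.0)) < 1e-12
            for i in range(2) for j in range(2))
recon = all(abs(sum(Q[i][k] * R[i][j] for i in range(2)) - A[j][k]) < 1e-12
            for j in range(2) for k in range(3))
print(ortho and recon)  # True
```

In practice one would use a library routine (for example a Householder-based QR) rather than classical Gram-Schmidt, which loses orthogonality in floating point for ill-conditioned inputs.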
The complex analogues of orthogonal matrices are the unitary matrices.