In linear algebra, an **orthogonal matrix** is a real square matrix `G` whose transpose equals its inverse, i.e.,

*GG*^{T} = *G*^{T}*G* = *I*_{n},

where *I*_{n} is the *n*-by-*n* identity matrix.

A real square matrix is orthogonal if and only if its columns form an orthonormal basis of **R**^{n} with the ordinary Euclidean dot product, which is the case if and only if its rows form an orthonormal basis of **R**^{n}.
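This equivalence is easy to check numerically. The following sketch (assuming NumPy is available) builds a standard 2-D rotation matrix and verifies that both `G`^{T}`G` and `G``G`^{T} give the identity, i.e., that its columns and rows are orthonormal:

```python
import numpy as np

# A 2-D rotation by 30 degrees: a standard example of an orthogonal matrix.
theta = np.pi / 6
G = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# G^T G = I: the columns of G form an orthonormal basis of R^2.
print(np.allclose(G.T @ G, np.eye(2)))   # True
# G G^T = I: the rows are orthonormal as well.
print(np.allclose(G @ G.T, np.eye(2)))   # True
```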

Geometrically, orthogonal matrices describe linear transformations of **R**^{n} which preserve angles and lengths, such as rotations and reflections. They are compatible with the Euclidean inner product in the following sense: if `G` is orthogonal and **x** and **y** are vectors in **R**^{n}, then

- <`G`**x**, `G`**y**> = <**x**, **y**>.

Equivalently, the linear map *f*(**x**) = `G`**x** preserves the inner product:

- <*f*(**x**), *f*(**y**)> = <**x**, **y**>.
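This preservation of the inner product can be demonstrated with a concrete orthogonal matrix. A minimal sketch using NumPy, here with a reflection across the line *y* = *x* (the particular vectors are arbitrary choices for illustration):

```python
import numpy as np

# Reflection across the line y = x: orthogonal, but not a rotation.
G = np.array([[0.0, 1.0],
              [1.0, 0.0]])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

# <Gx, Gy> equals <x, y>: the inner product is preserved.
print(np.dot(G @ x, G @ y))  # 1.0
print(np.dot(x, y))          # 1.0
```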

The inverse of every orthogonal matrix is again orthogonal, as is the matrix product of two orthogonal matrices. This shows that the set of all `n`-by-`n` orthogonal matrices forms a group. It is a Lie group
of dimension `n`(`n`-1)/2 and is called the orthogonal group, denoted by O(`n`).
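The two closure properties can be spot-checked numerically. A sketch assuming NumPy, using a rotation and a reflection as the two group elements:

```python
import numpy as np

def is_orthogonal(M, tol=1e-12):
    """Check M^T M = I up to floating-point tolerance."""
    return np.allclose(M.T @ M, np.eye(M.shape[0]), atol=tol)

c, s = np.cos(0.7), np.sin(0.7)
R = np.array([[c, -s], [s, c]])          # a rotation
F = np.array([[1.0, 0.0], [0.0, -1.0]])  # a reflection

# The product of two orthogonal matrices is orthogonal...
print(is_orthogonal(R @ F))             # True
# ...and so is the inverse of an orthogonal matrix.
print(is_orthogonal(np.linalg.inv(R)))  # True
```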

The determinant of any orthogonal matrix is 1 or -1. That can be shown as follows:

- 1 = det(*I*) = det(`G``G`^{T}) = det(`G`) det(`G`^{T}) = (det(`G`))^{2}.
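Both signs of the determinant occur: rotations have determinant +1 and reflections have determinant -1. A quick numerical check, assuming NumPy:

```python
import numpy as np

c, s = np.cos(1.2), np.sin(1.2)
R = np.array([[c, -s], [s, c]])          # rotation: det = +1
F = np.array([[0.0, 1.0], [1.0, 0.0]])   # reflection (swap axes): det = -1

print(np.isclose(np.linalg.det(R), 1.0))   # True
print(np.isclose(np.linalg.det(F), -1.0))  # True
```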

All eigenvalues of an orthogonal matrix, even the complex ones, have absolute value 1. Eigenvectors for different eigenvalues are orthogonal.
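For example, a plane rotation by angle θ has the complex eigenvalues e^{±iθ}, which lie on the unit circle. A sketch verifying this with NumPy:

```python
import numpy as np

c, s = np.cos(2.0), np.sin(2.0)
R = np.array([[c, -s], [s, c]])  # rotation by 2 radians

# Eigenvalues are e^{2i} and e^{-2i}: complex, with absolute value 1.
eigvals = np.linalg.eigvals(R)
print(np.allclose(np.abs(eigvals), 1.0))  # True
```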

If *Q* is orthogonal, then one can always find an orthogonal matrix *P* such that *P*^{-1}*QP* is block diagonal, where each block is either a 2-by-2 rotation matrix or a 1-by-1 block equal to 1 or -1.

If *A* is an arbitrary *m*-by-*n* matrix of rank *n*, we can always write *A* = *QR*, where *Q* is an *m*-by-*n* matrix with orthonormal columns and *R* is an invertible *n*-by-*n* upper triangular matrix (the QR decomposition).

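This factorization is available directly in NumPy via `np.linalg.qr`, which by default returns the reduced form described above. A sketch with an arbitrarily chosen full-rank 4-by-2 matrix:

```python
import numpy as np

# A 4-by-2 matrix of full column rank (rank 2).
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0],
              [2.0, 1.0]])

# Reduced QR: Q is 4-by-2 with orthonormal columns, R is 2-by-2 upper triangular.
Q, R = np.linalg.qr(A)
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
print(np.allclose(Q @ R, A))            # True: the factorization reproduces A
```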
The complex analogs of orthogonal matrices are the unitary matrices, for which the conjugate transpose *U*^{\*} takes the place of the transpose: *U*^{\*}*U* = *UU*^{\*} = *I*.
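A small sketch of the unitary condition, assuming NumPy, using a 2-by-2 matrix with complex entries (so it is unitary but not orthogonal):

```python
import numpy as np

# A 2-by-2 unitary matrix with complex entries.
U = np.array([[1.0, 1.0j],
              [1.0j, 1.0]]) / np.sqrt(2)

# U* U = I, with the conjugate transpose in place of the plain transpose.
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True
```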