In applied mathematics and physics the eigenvectors of a matrix or a differential operator often have important physical significance. In classical mechanics the eigenvectors of the governing equations typically correspond to natural modes of vibration in a body, and the eigenvalues to their frequencies. In quantum mechanics, operators correspond to observable variables, eigenvectors are also called **eigenstates**, and the eigenvalues of an operator represent those values of the corresponding variable that have non-zero probability of occurring.


- rotation (by an angle that is not a multiple of 180°): no real eigenvectors
- reflection: eigenvectors are perpendicular and parallel to the line of symmetry, with eigenvalues -1 and 1, respectively
- scaling: all vectors are eigenvectors, and the eigenvalue is the scale factor
- projection onto a line: eigenvectors with eigenvalue 1 are parallel to the line, eigenvectors with eigenvalue 0 are parallel to the direction of projection
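These geometric examples can be checked numerically. The following sketch uses NumPy (an assumption; the article itself prescribes no software), with a reflection across the x-axis and a uniform scaling chosen purely for illustration:

```python
import numpy as np

# Reflection across the x-axis (the line of symmetry is the x-axis).
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])
vals, vecs = np.linalg.eig(R)
# One eigenvalue is 1 (eigenvectors parallel to the line of symmetry),
# the other is -1 (eigenvectors perpendicular to it).
print(np.sort(vals))   # [-1.  1.]

# Uniform scaling by a factor of 3: every non-zero vector is an eigenvector,
# and the eigenvalue is the scale factor.
S = 3.0 * np.eye(2)
print(np.linalg.eig(S)[0])   # [3. 3.]
```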

For example, consider the matrix

An important tool for describing eigenvalues of square matrices is the characteristic polynomial: saying that *c* is an eigenvalue of **A** is equivalent to stating that the system of linear equations (**A** - *c***I**) **x** = **0** (where **I** is the identity matrix) has a non-zero solution **x** (namely an eigenvector), and so it is equivalent to the determinant det(**A** - *c* **I**) being zero. The function *p*(*c*) = det(**A** - *c***I**) is a polynomial in *c* since determinants are defined as sums of products.
This is the *characteristic polynomial* of **A**; its zeros are precisely the eigenvalues of **A**.
If **A** is an *n*-by-*n* matrix, then its characteristic polynomial has degree *n* and **A** can therefore have at most *n* eigenvalues.
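The connection between the characteristic polynomial and the eigenvalues can be sketched in NumPy (an illustrative 2-by-2 matrix is used here, not the article's own example):

```python
import numpy as np

# An illustrative 2-by-2 matrix (not the article's example, which is omitted here).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.poly returns the coefficients of the characteristic polynomial
# det(c*I - A), highest-degree term first: here c**2 - 4*c + 3.
coeffs = np.poly(A)

# Its zeros are precisely the eigenvalues of A.
eigenvalues = np.sort(np.roots(coeffs))
print(eigenvalues)   # [1. 3.]
```

Note that because **A** is 2-by-2, the polynomial has degree 2 and there are at most 2 eigenvalues, matching the statement above.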

Returning to the example above, we could compute all of **A**'s eigenvalues by first determining the characteristic polynomial *p*(*c*) = det(**A** - *c***I**) and then finding its zeros, which are precisely the eigenvalues of **A**.

(In practice, eigenvalues of large matrices are not computed using the characteristic polynomial. Faster and more numerically stable methods are available, for instance the QR algorithm, which is based on the QR decomposition.)
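In NumPy, for instance, `numpy.linalg.eig` delegates to LAPACK routines built on this kind of iteration rather than forming the characteristic polynomial. A minimal sketch, with an arbitrary matrix chosen for illustration:

```python
import numpy as np

# An arbitrary example matrix (illustrative only).
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 4.0, 9.0]])

# eig works on a reduced (Hessenberg) form of A by iteration,
# not by extracting roots of the characteristic polynomial.
vals, vecs = np.linalg.eig(A)

# Column i of `vecs` is an eigenvector for vals[i]: A v = lambda v.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
print(np.sort(vals))   # [ 1.  2. 11.]
```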

Note that if **A** is a real matrix, the characteristic polynomial will have real coefficients, but not all of its roots will necessarily be real. The complex eigenvalues are all associated with complex eigenvectors.

In general, if **v**_{1}, ..., **v**_{m} are eigenvectors corresponding to *distinct* eigenvalues λ_{1}, ..., λ_{m}, then the vectors **v**_{1}, ..., **v**_{m} are necessarily linearly independent.
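This independence can be observed numerically; the following sketch uses a hypothetical matrix with two distinct eigenvalues:

```python
import numpy as np

# Upper triangular, so the eigenvalues are the diagonal entries 1 and 3,
# which are distinct. (Matrix chosen purely for illustration.)
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

vals, vecs = np.linalg.eig(A)

# Because the eigenvalues are distinct, the eigenvectors (the columns of
# `vecs`) must be linearly independent, i.e. `vecs` has full rank.
assert len(set(np.round(vals, 8))) == 2
assert np.linalg.matrix_rank(vecs) == 2
```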

The spectral theorem for symmetric matrices states that, if **A** is a real symmetric *n*-by-*n* matrix, then all its eigenvalues are real, and there exist *n* linearly independent eigenvectors for **A** which all have length 1 and are mutually orthogonal.
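The spectral theorem's conclusion can be illustrated with NumPy's `numpy.linalg.eigh`, which is specialized to symmetric matrices (the matrix below is an illustrative stand-in, not the article's example):

```python
import numpy as np

# A real symmetric matrix, chosen for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh exploits symmetry: it returns real eigenvalues and an
# orthonormal set of eigenvectors, as the spectral theorem guarantees.
vals, vecs = np.linalg.eigh(A)
assert np.all(np.isreal(vals))

# The columns of `vecs` have length 1 and are mutually orthogonal,
# so vecs.T @ vecs is the identity matrix.
assert np.allclose(vecs.T @ vecs, np.eye(3))
```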

Our example matrix from above is symmetric, and so it admits three mutually orthogonal eigenvectors of length 1.

The concept of eigenvectors can be extended to linear operators acting on infinite-dimensional Hilbert spaces or Banach spaces. In fact, this is an important topic in functional analysis. See also: spectrum, spectral theorem.