# Eigenvalue

In linear algebra, a scalar λ is called an **eigenvalue** (in some older texts, a **characteristic value**) of a linear mapping *A* if there exists a nonzero vector *x* such that *Ax*=λ*x*. The vector *x* is called an eigenvector.
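The defining equation *Ax*=λ*x* can be checked directly for a concrete matrix. Below is a minimal sketch in plain Python (the matrix, vector, and eigenvalue are chosen purely for illustration):

```python
# Verify that x = (1, 1) is an eigenvector of A = [[2, 1], [1, 2]]
# with eigenvalue λ = 3, i.e. that Ax = λx.
A = [[2, 1],
     [1, 2]]
x = [1, 1]
lam = 3

# Matrix-vector product Ax.
Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
# Scalar multiple λx.
lam_x = [lam * xi for xi in x]

print(Ax, lam_x)  # both are [3, 3], so Ax = λx holds
```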

In matrix theory, an element λ in the underlying ring *R* of a square matrix *A* is called a **right eigenvalue** if there exists a nonzero column vector *x* such that *Ax*=λ*x*, or a **left eigenvalue** if there exists a nonzero row vector *y* such that *yA*=*y*λ. If *R* is commutative, the left eigenvalues of *A* are exactly the right eigenvalues of *A* and are just called **eigenvalues**. If *R* is not commutative, e.g. the quaternions, they may be different.

Suppose *A* is a square matrix over a commutative ring. The **algebraic multiplicity** (or simply **multiplicity**) of an eigenvalue λ of *A* is the number of factors *t*-λ in the characteristic polynomial of *A*. The **geometric multiplicity** of λ is the nullity of (λI-*A*), that is, the dimension of the eigenspace of λ. The geometric multiplicity of an eigenvalue never exceeds its algebraic multiplicity.

An eigenvalue of algebraic multiplicity 1 is called a *simple eigenvalue*.
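The two multiplicities can differ. A standard illustration (the matrix here is a hypothetical choice, a 2×2 Jordan block) has algebraic multiplicity 2 but geometric multiplicity 1:

```python
# A = [[2, 1], [0, 2]] is a Jordan block with single eigenvalue λ = 2.
A = [[2, 1],
     [0, 2]]
lam = 2

# Characteristic polynomial: det(tI - A) = (t - 2)^2,
# so the factor t - 2 appears twice: algebraic multiplicity is 2.
algebraic_multiplicity = 2

def rank_2x2(M):
    """Rank of a 2x2 matrix: 2 if det != 0, 1 if nonzero with det = 0, else 0."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    if det != 0:
        return 2
    return 1 if any(e != 0 for row in M for e in row) else 0

# Geometric multiplicity = nullity of (λI - A) = 2 - rank(λI - A).
M = [[lam - A[0][0], -A[0][1]],
     [-A[1][0], lam - A[1][1]]]      # = [[0, -1], [0, 0]], rank 1
geometric_multiplicity = 2 - rank_2x2(M)

print(algebraic_multiplicity, geometric_multiplicity)  # 2 1
```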

In functional analysis, the spectrum of a linear operator *A* is the set of scalars ν such that νI-*A* is not invertible. If the underlying Hilbert space is finite-dimensional, then the spectrum of *A* coincides with the set of eigenvalues of *A*.
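In the finite-dimensional case, νI-*A* fails to be invertible exactly when det(νI-*A*) = 0. A small sketch with an illustrative diagonal matrix (chosen here as an assumption, with eigenvalues 2 and 5):

```python
# For A = [[2, 0], [0, 5]], νI - A is non-invertible (det = 0)
# precisely when ν is an eigenvalue of A.
A = [[2, 0],
     [0, 5]]

def det_nu_minus_A(nu):
    # Determinant of νI - A for this 2x2 matrix.
    M = [[nu - A[0][0], -A[0][1]],
         [-A[1][0], nu - A[1][1]]]
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Scan a few integer values of ν and keep those where νI - A is singular.
spectrum = [nu for nu in range(10) if det_nu_minus_A(nu) == 0]
print(spectrum)  # [2, 5]
```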

Occasionally, in an article on matrix theory, one may read a statement like:
- The eigenvalues of a matrix
*A* are 4,4,3,3,3,2,2,1.

It means that the algebraic multiplicity of 4 is two, that of 3 is three, that of 2 is two, and that of 1 is one.
This style is used because algebraic multiplicity is the key to many mathematical proofs in matrix theory.

Suppose the eigenvalues of an *n*×*n* matrix *A*, counted with algebraic multiplicity, are λ_{1},λ_{2},...,λ_{n}. Then the trace of *A* is λ_{1}+λ_{2}+...+λ_{n} and the determinant of *A* is λ_{1}λ_{2}...λ_{n}. These two are very important quantities in matrix theory.
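These identities can be verified for a 2×2 example, where the characteristic polynomial is *t*²-(trace)*t*+det and the eigenvalues follow from the quadratic formula (the matrix below is an arbitrary illustrative choice):

```python
import math

# Check trace = λ1 + λ2 and determinant = λ1·λ2 for a 2x2 matrix.
A = [[4, 2],
     [1, 3]]
trace = A[0][0] + A[1][1]                    # 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # 12 - 2 = 10

# Characteristic polynomial t^2 - trace·t + det = 0, solved by the
# quadratic formula.
disc = trace ** 2 - 4 * det                  # 49 - 40 = 9
lam1 = (trace + math.sqrt(disc)) / 2         # 5.0
lam2 = (trace - math.sqrt(disc)) / 2         # 2.0

print(lam1 + lam2, trace)  # 7.0 7
print(lam1 * lam2, det)    # 10.0 10
```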

Please refer to eigenvector for some other properties of eigenvalues.