- every linear combination *Y* = *a*_{1}*X*_{1} + ... + *a*_{n}*X*_{n} is normally distributed;
- there is a random vector **Z** = (*Z*_{1}, ..., *Z*_{m}), whose components are independent standard normal random variables, a vector **μ** = (*μ*_{1}, ..., *μ*_{n}), and an *n*-by-*m* matrix *A* such that **X** = *A***Z** + **μ**;
- there is a vector **μ** and a symmetric, positive semi-definite matrix Γ such that the characteristic function of **X** is φ_{**X**}(**u**) = exp(i **u**^{T}**μ** − ½ **u**^{T}Γ**u**).
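The second characterization can be illustrated numerically. The following is a minimal sketch (the matrix *A*, vector **μ**, and coefficients *a* are illustrative choices, not taken from the text): it draws samples of **X** = *A***Z** + **μ** from independent standard normal **Z** and checks that a linear combination *a*^{T}**X** has the mean *a*^{T}**μ** and variance *a*^{T}(*AA*^{T})*a* that the normal characterization predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: A is n-by-m (n=3, m=2), mu has length n.
A = np.array([[1.0, 0.5],
              [0.0, 2.0],
              [1.0, 1.0]])
mu = np.array([1.0, -1.0, 0.0])

# Z has independent standard normal components; X = A Z + mu.
Z = rng.standard_normal((2, 100_000))
X = A @ Z + mu[:, None]

# Any linear combination Y = a^T X is normal with
# mean a^T mu and variance a^T (A A^T) a.
a = np.array([1.0, 2.0, -1.0])
Y = a @ X
print(Y.mean(), a @ mu)             # sample mean vs. exact mean
print(Y.var(), a @ (A @ A.T) @ a)   # sample variance vs. exact variance
```

The sample moments agree with the exact ones up to Monte Carlo error of order 1/√(number of samples).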

The following is not *quite* equivalent to the conditions above, since it fails to allow for a singular covariance matrix:

- there is a vector **μ** = (*μ*_{1}, ..., *μ*_{n}) and a symmetric, positive definite matrix Γ such that **X** has density
  *f*(**x**) = (2π)^{−*n*/2} (det Γ)^{−1/2} exp(−½ (**x** − **μ**)^{T} Γ^{−1} (**x** − **μ**)).
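The density condition can be sketched in code. Below, `mvn_density` is a hypothetical helper (not from the text) implementing the formula above; as a sanity check, a two-dimensional example with illustrative **μ** and Γ is numerically integrated over a wide grid, which should give approximately 1.

```python
import numpy as np

def mvn_density(x, mu, Gamma):
    """Nonsingular multivariate normal density:
    (2*pi)^(-n/2) * det(Gamma)^(-1/2) * exp(-(x-mu)^T Gamma^{-1} (x-mu) / 2).
    Requires Gamma symmetric positive definite."""
    n = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Gamma, d)
    return (2 * np.pi) ** (-n / 2) / np.sqrt(np.linalg.det(Gamma)) * np.exp(-quad / 2)

# Illustrative parameters; Gamma is symmetric positive definite.
mu = np.array([0.0, 1.0])
Gamma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Riemann sum of the density over a grid covering many standard deviations.
xs = np.linspace(-8.0, 8.0, 300)
ys = np.linspace(-7.0, 9.0, 300)
total = sum(mvn_density(np.array([x, y]), mu, Gamma) for x in xs for y in ys)
total *= (xs[1] - xs[0]) * (ys[1] - ys[0])
print(total)  # close to 1
```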

The vector **μ** in these conditions is the expected value of **X**, and the matrix Γ = *AA*^{T} is the covariance matrix of the components *X*_{i}.
It is important to realize that the covariance matrix must be allowed to be singular. That case arises frequently in statistics; for example, in the distribution of the vector of residuals in ordinary linear regression problems.
Note also that the *X*_{i} are in general *not* independent; they can be seen as the result of applying the linear transformation *A* to a collection of independent Gaussian variables **Z**.
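Both remarks can be seen in a minimal sketch with an illustrative rank-deficient *A* (chosen here for illustration): the resulting covariance matrix *AA*^{T} is singular, so no density exists, and the components of **X** are dependent even though the components of **Z** are independent.

```python
import numpy as np

# A rank-deficient A (m=1 < n=2): X lives on a line in R^2,
# so its covariance matrix is singular and X1, X2 are perfectly correlated.
A = np.array([[1.0],
              [2.0]])
Gamma = A @ A.T

print(np.linalg.det(Gamma))          # ~0: singular, so no density exists
print(np.linalg.matrix_rank(Gamma))  # 1 < 2
print(Gamma)                         # off-diagonal Cov(X1, X2) = 2 != 0:
                                     # X1 and X2 are dependent
```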

Proof (sketch)

Multivariate Gaussian density

- Recall the definition of the characteristic function of a random vector.
- Recall the characterizations of one-dimensional Gaussian random variables.
- Calculate the characteristic function of **X** = *A***Z** + **μ** in terms of the characteristic function of **Z**.
- Deduce the characteristic function of **X** in terms of the mean vector **μ** and the covariance matrix Γ.
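The steps above can be carried out as follows (a sketch in the notation of this article, with φ denoting a characteristic function):

```latex
% Steps 1-2: for independent standard normal Z_j,
% E[e^{i t_j Z_j}] = e^{-t_j^2/2}, so
\varphi_{\mathbf{Z}}(\mathbf{t})
  = \mathbb{E}\!\left[e^{i\,\mathbf{t}^{T}\mathbf{Z}}\right]
  = \prod_{j=1}^{m} e^{-t_j^{2}/2}
  = e^{-\frac{1}{2}\mathbf{t}^{T}\mathbf{t}}

% Step 3: with X = AZ + \mu,
\varphi_{\mathbf{X}}(\mathbf{u})
  = \mathbb{E}\!\left[e^{i\,\mathbf{u}^{T}(A\mathbf{Z}+\boldsymbol{\mu})}\right]
  = e^{i\,\mathbf{u}^{T}\boldsymbol{\mu}}\,
    \varphi_{\mathbf{Z}}(A^{T}\mathbf{u})

% Step 4: substituting gives
\varphi_{\mathbf{X}}(\mathbf{u})
  = \exp\!\left(i\,\mathbf{u}^{T}\boldsymbol{\mu}
      - \tfrac{1}{2}\,\mathbf{u}^{T} A A^{T}\, \mathbf{u}\right)
% so \Gamma = AA^{T} appears as the covariance matrix.
```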