
Multivariate normal distribution

A random vector X = (X1, ..., Xn) follows a multivariate normal distribution, also sometimes called a multivariate Gaussian distribution (in honor of Carl Friedrich Gauss, who was not the first to write about the normal distribution), if it satisfies the following equivalent conditions:

- every linear combination Y = a1X1 + ... + anXn of its components is normally distributed;
- there is a vector μ and a matrix A such that X = μ + A^T Z, where Z = (Z1, ..., Zn) is a vector of independent standard normal random variables;
- the characteristic function of X has the form

φ_X(u) = exp(iμ^T u − ½ u^T Γ u)

for some vector μ and some nonnegative-definite symmetric matrix Γ.
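The characteristic function condition can be checked numerically by Monte Carlo: estimate E[exp(iu^T X)] from samples of X and compare it to the closed form. The particular μ, A, and u below are illustrative choices, not taken from the article.

```python
# Monte Carlo check of the characteristic function formula
# phi_X(u) = exp(i mu^T u - (1/2) u^T Gamma u).
# mu, A, and u are illustrative values chosen for this sketch.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
Gamma = A.T @ A                      # covariance matrix, as in the article

# Draw samples X = mu + A^T Z with Z standard normal, so Cov(X) = A^T A.
Z = rng.standard_normal((200_000, 2))
X = mu + Z @ A                       # each row is mu + A^T z for a row z

u = np.array([0.3, -0.1])
empirical = np.mean(np.exp(1j * X @ u))          # estimate of E[exp(i u^T X)]
theoretical = np.exp(1j * mu @ u - 0.5 * u @ Gamma @ u)
print(abs(empirical - theoretical))              # small Monte Carlo error
```

With 200,000 samples the discrepancy is on the order of 1/√n, so the two agree to a few decimal places.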

The following is not quite equivalent to the conditions above, since it fails to allow for a singular covariance matrix Γ: the density of X is

f_X(x1, ..., xn) dx1 ... dxn = det(2πΓ)^(−1/2) exp(−½ (x − μ)^T Γ^(−1) (x − μ)) dx1 ... dxn.
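For a diagonal (hence nonsingular) Γ the components are independent, and the density above must factor into a product of one-dimensional normal densities; this gives an exact numerical check of the formula. The values of μ, Γ, and x below are illustrative.

```python
# Check of the density formula
#   f(x) = det(2 pi Gamma)^(-1/2) exp(-(1/2) (x - mu)^T Gamma^{-1} (x - mu))
# against the product of one-dimensional normal densities for a diagonal
# Gamma. mu, Gamma, and x are illustrative values chosen for this sketch.
import numpy as np

mu = np.array([0.5, -1.0])
Gamma = np.diag([2.0, 0.5])          # diagonal, hence nonsingular

def mvn_pdf(x, mu, Gamma):
    d = x - mu
    norm = np.linalg.det(2 * np.pi * Gamma) ** -0.5
    return norm * np.exp(-0.5 * d @ np.linalg.solve(Gamma, d))

def normal_pdf(x, m, var):
    return np.exp(-(x - m) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

x = np.array([1.0, 0.0])
joint = mvn_pdf(x, mu, Gamma)
product = normal_pdf(x[0], mu[0], 2.0) * normal_pdf(x[1], mu[1], 0.5)
print(joint, product)                # the two values agree
```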

The vector μ in these conditions is the expected value of X and the matrix Γ = A^T A is the covariance matrix of the components Xi. It is important to realize that the covariance matrix must be allowed to be singular. That case arises frequently in statistics; for example, in the distribution of the vector of residuals in ordinary linear regression problems. Note also that the Xi are in general not independent; they can be seen as the result of applying the linear transformation A to a collection of independent Gaussian variables Z.
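The relationship between A, Γ, and the dependence of the components can be illustrated by simulation: samples of X = μ + A^T Z have empirical covariance close to A^T A, with nonzero off-diagonal entries showing that the Xi are correlated. The particular μ and A are illustrative.

```python
# Monte Carlo illustration that for X = mu + A^T Z (Z a vector of
# independent standard normals) the covariance matrix of X is A^T A,
# and the components are correlated when A^T A is not diagonal.
# mu and A are illustrative values chosen for this sketch.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.0, 5.0])
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

Z = rng.standard_normal((500_000, 2))
X = mu + Z @ A                       # each row is mu + A^T z for a row z

emp_cov = np.cov(X, rowvar=False)
print(emp_cov)                       # close to A.T @ A = [[1, 2], [2, 5]]
```

The off-diagonal entry (here ≈ 2) is the covariance of X1 and X2, so the components are dependent even though the underlying Zi are independent.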

Proof (sketch)

Multivariate Gaussian density

Recall the definition of the characteristic function of a random vector: φ_X(u) = E[exp(iu^T X)].

Recall the characterizations of Gaussian random variables; in particular, a standard normal variable Z has characteristic function φ_Z(t) = exp(−t²/2).

Calculate the characteristic function of X = μ + A^T Z in terms of the characteristic function of Z.

Deduce the characteristic function of X in terms of the mean vector and covariance matrix: φ_X(u) = exp(iμ^T u − ½ u^T Γ u).
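Carried out, the steps above give the following derivation (writing X = μ + A^T Z with Z a vector of independent standard normal variables, so that Γ = A^T A):

```latex
\begin{align*}
\varphi_X(u) &= \mathbb{E}\!\left[e^{iu^T X}\right]
              = \mathbb{E}\!\left[e^{iu^T(\mu + A^T Z)}\right]
              = e^{i\mu^T u}\,\mathbb{E}\!\left[e^{i(Au)^T Z}\right] \\
             &= e^{i\mu^T u}\prod_{j=1}^{n}\mathbb{E}\!\left[e^{i(Au)_j Z_j}\right]
              = e^{i\mu^T u}\prod_{j=1}^{n}e^{-(Au)_j^2/2} \\
             &= e^{i\mu^T u}\,e^{-\frac{1}{2}u^T A^T A u}
              = \exp\!\left(i\mu^T u - \tfrac{1}{2}u^T \Gamma u\right).
\end{align*}
```

The product step uses the independence of the Zj, and the step after it uses the one-dimensional characteristic function recalled above.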