7.8. Multivariate Gaussian Distribution

Definition 7.32

A random vector $X = [X_1, \dots, X_n]^T$ is called a Gaussian random vector if

$$\langle t, X \rangle = X^T t = \sum_{i=1}^{n} t_i X_i = t_1 X_1 + \dots + t_n X_n$$

follows a (univariate) normal distribution for all $t = [t_1, \dots, t_n]^T \in \mathbb{R}^n$. The components $X_1, \dots, X_n$ are called jointly Gaussian. It is denoted by $X \sim \mathcal{N}_n(\mu, \Sigma)$ where $\mu$ is its mean vector and $\Sigma$ is its covariance matrix.
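This definition is easy to check numerically. Below is a minimal sketch (the values of $\mu$, $\Sigma$, and $t$ are assumed for illustration): we draw samples of $X$ and verify that $t^T X$ has mean $t^T \mu$ and variance $t^T \Sigma t$, as the corresponding univariate normal should.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])                      # assumed mean vector
Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.8]])                  # assumed covariance matrix
t = np.array([0.5, -1.0, 2.0])                       # arbitrary direction t

X = rng.multivariate_normal(mu, Sigma, size=100_000) # rows are samples of X
Z = X @ t                                            # Z = t^T X per sample

print(Z.mean(), t @ mu)          # both close to t^T mu
print(Z.var(), t @ Sigma @ t)    # both close to t^T Sigma t
```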

Let $X \sim \mathcal{N}_n(\mu, \Sigma)$ be a Gaussian random vector. The subscript $n$ denotes that it takes values in the space $\mathbb{R}^n$. We assume that $\Sigma$ is invertible. Its PDF is given by

$$f_X(x) = \frac{1}{(2\pi)^{n/2} \det(\Sigma)^{1/2}} \exp\left\{ -\frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu) \right\}.$$
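As a sanity check, this formula can be evaluated directly and compared against SciPy's reference implementation; the values of $\mu$, $\Sigma$, and $x$ below are assumed for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
x = np.array([0.3, 0.7])

n = len(mu)
diff = x - mu
quad = diff @ np.linalg.solve(Sigma, diff)   # (x-mu)^T Sigma^{-1} (x-mu)
pdf = np.exp(-0.5 * quad) / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma)))

print(pdf, multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # should agree
```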

Moments:

- $E[X] = \mu \in \mathbb{R}^n$.
- $E[X X^T] = \Sigma + \mu \mu^T$.
- $\text{Cov}[X] = E[X X^T] - E[X] E[X]^T = \Sigma$.
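These identities can be verified by Monte Carlo; the $\mu$ and $\Sigma$ below are assumed example values.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.5, 0.4],
                  [0.4, 0.9]])

X = rng.multivariate_normal(mu, Sigma, size=200_000)

print(X.mean(axis=0))             # ~ mu
print(X.T @ X / len(X))           # ~ E[X X^T] = Sigma + mu mu^T
print(Sigma + np.outer(mu, mu))
print(np.cov(X, rowvar=False))    # ~ Sigma
```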

Let $Y = AX + b$ where $A \in \mathbb{R}^{n \times n}$ is an invertible matrix and $b \in \mathbb{R}^n$. Then

$$Y \sim \mathcal{N}_n(A\mu + b, \, A \Sigma A^T).$$

$Y$ is also a Gaussian random vector, with mean vector $A\mu + b$ and covariance matrix $A \Sigma A^T$. This is essentially a change of basis in $\mathbb{R}^n$.
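A quick empirical check of this property, with an assumed invertible $A$ and offset $b$:

```python
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([0.0, 2.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])                # invertible
b = np.array([1.0, -1.0])

X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ A.T + b                           # apply Y = A X + b row-wise

print(Y.mean(axis=0), A @ mu + b)         # mean ~ A mu + b
print(np.cov(Y, rowvar=False))            # covariance ~ A Sigma A^T
print(A @ Sigma @ A.T)
```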

The characteristic function (CF) is given by

$$\Psi_X(j\omega) = \exp\left( j \omega^T \mu - \frac{1}{2} \omega^T \Sigma \omega \right), \quad \omega \in \mathbb{R}^n.$$
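The closed form can be compared against a Monte Carlo estimate of $E[\exp(j \omega^T X)]$; the $\mu$, $\Sigma$, and $\omega$ below are assumed values.

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, 0.0])
Sigma = np.array([[1.0, 0.2],
                  [0.2, 0.5]])
w = np.array([0.4, -0.3])

X = rng.multivariate_normal(mu, Sigma, size=500_000)
cf_mc = np.mean(np.exp(1j * X @ w))                    # Monte Carlo estimate
cf_closed = np.exp(1j * w @ mu - 0.5 * w @ Sigma @ w)  # closed form

print(cf_mc, cf_closed)                                # should be close
```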

7.8.1. Whitening

Usually we are interested in making the components of $X$ uncorrelated. This process is known as whitening. We look for an affine transformation $Y = AX + b$ such that the components of $Y$ are uncorrelated with unit variance, i.e., we start with

$$X \sim \mathcal{N}_n(\mu, \Sigma)$$

and transform $Y = AX + b$ such that

$$Y \sim \mathcal{N}_n(0, I_n)$$

where $I_n$ is the $n \times n$ identity matrix.

7.8.1.1. Whitening by Eigenvalue Decomposition

Let

$$\Sigma = E \Lambda E^T$$

be the eigenvalue decomposition of $\Sigma$, with $\Lambda$ a diagonal matrix of eigenvalues and $E$ an orthogonal matrix whose columns form an orthonormal basis of eigenvectors.

Let

$$\Lambda^{\frac{1}{2}} = \text{diag}\left(\lambda_1^{\frac{1}{2}}, \dots, \lambda_n^{\frac{1}{2}}\right).$$

Choose $B = E \Lambda^{\frac{1}{2}}$ and $A = B^{-1} = \Lambda^{-\frac{1}{2}} E^T$.
Then

$$\text{Cov}(B^{-1} X) = \text{Cov}(A X) = \Lambda^{-\frac{1}{2}} E^T \Sigma E \Lambda^{-\frac{1}{2}} = I.$$

$$E[B^{-1} X] = B^{-1} \mu \implies E[B^{-1}(X - \mu)] = 0.$$

Thus the random vector $Y = B^{-1}(X - \mu)$ is a whitened vector of uncorrelated components.
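A minimal sketch of this procedure with NumPy, using assumed values of $\mu$ and $\Sigma$; `np.linalg.eigh` returns the eigenvalues and an orthonormal set of eigenvectors of the symmetric matrix $\Sigma$, and the sample covariance of $Y$ should come out close to the identity.

```python
import numpy as np

rng = np.random.default_rng(4)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

lam, E = np.linalg.eigh(Sigma)            # Sigma = E diag(lam) E^T
A = np.diag(lam ** -0.5) @ E.T            # A = Lambda^{-1/2} E^T

X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = (X - mu) @ A.T                        # Y = B^{-1}(X - mu)

print(np.cov(Y, rowvar=False))            # ~ identity matrix
print(Y.mean(axis=0))                     # ~ 0
```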

7.8.1.2. Causal Whitening

We want the transformation to be causal, i.e., $A$ should be a lower triangular matrix (so that each $Y_i$ depends only on $X_1, \dots, X_i$). We start with the LDL decomposition

$$\Sigma = L D L^T = \left(L D^{\frac{1}{2}}\right) \left(D^{\frac{1}{2}} L^T\right),$$

where $L$ is a unit lower triangular matrix and $D$ is a diagonal matrix with positive entries.

Choose $B = L D^{\frac{1}{2}}$ and $A = B^{-1} = D^{-\frac{1}{2}} L^{-1}$. Since the inverse of a lower triangular matrix is lower triangular, $A$ is lower triangular.

The transformation is $Y = B^{-1}(X - \mu)$.
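A sketch of causal whitening with assumed $\mu$ and $\Sigma$ (taken positive definite). Rather than computing the LDL factors explicitly, it uses NumPy's Cholesky factor $C$, which equals $B = L D^{\frac{1}{2}}$ exactly, so $A = C^{-1}$ is lower triangular.

```python
import numpy as np

rng = np.random.default_rng(5)
mu = np.array([0.5, 1.0, -1.0])
Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.8]])

C = np.linalg.cholesky(Sigma)             # lower triangular, Sigma = C C^T
A = np.linalg.inv(C)                      # A = B^{-1}, also lower triangular

X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = (X - mu) @ A.T                        # Y = B^{-1}(X - mu)

print(np.allclose(A, np.tril(A)))         # True: A is lower triangular
print(np.cov(Y, rowvar=False))            # ~ identity matrix
```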