7.8. Multivariate Gaussian Distribution#
A random vector \(X = [X_1, \dots, X_n]^T\) is called a Gaussian random vector if

\[
t^T X = t_1 X_1 + \dots + t_n X_n
\]

follows a normal distribution for all \(t = [t_1, \dots, t_n ]^T \in \RR^n\). The components \(X_1, \dots, X_n\) are called jointly Gaussian. It is denoted by \(X \sim \NNN_n (\mu, \Sigma)\) where \(\mu\) is its mean vector and \(\Sigma\) is its covariance matrix.
Let \(X \sim \NNN_n (\mu, \Sigma)\) be a Gaussian random vector. The subscript \(n\) denotes that it takes values over the space \(\RR^n\). We assume that \(\Sigma\) is invertible. Its PDF is given by

\[
f_X(x) = \frac{1}{(2 \pi)^{n/2} \det (\Sigma)^{1/2}} \exp \left( - \frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu) \right).
\]
Moments:

\[
\EE [X] = \mu, \qquad \EE \left[ (X - \mu) (X - \mu)^T \right] = \Sigma.
\]
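As a quick sanity check, here is a minimal NumPy/SciPy sketch that evaluates this PDF directly from the formula above and compares it against `scipy.stats.multivariate_normal`; the particular values of `mu`, `Sigma`, and `x` are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gaussian_pdf(x, mu, Sigma):
    """Evaluate the multivariate Gaussian PDF at x directly from the formula."""
    n = mu.shape[0]
    diff = x - mu
    quad = diff @ np.linalg.solve(Sigma, diff)   # (x - mu)^T Sigma^{-1} (x - mu)
    norm_const = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma))
    return np.exp(-0.5 * quad) / norm_const

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
x = np.array([0.5, 0.0])

print(gaussian_pdf(x, mu, Sigma))             # direct evaluation
print(multivariate_normal(mu, Sigma).pdf(x))  # SciPy reference value
```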
Let \(Y = A X + b\) where \(A \in \RR^{n \times n}\) is an invertible matrix and \(b \in \RR^n\). Then
\(Y\) is also a Gaussian random vector, with mean vector \(A \mu + b\) and covariance matrix \(A \Sigma A^T\); i.e., \(Y \sim \NNN_n (A \mu + b, A \Sigma A^T)\). This is essentially an affine change of coordinates in \(\RR^n\).
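This behavior under affine maps is easy to verify empirically. The following minimal NumPy sketch (with arbitrary illustrative values for \(\mu\), \(\Sigma\), \(A\), and \(b\)) draws samples of \(X\), applies \(Y = AX + b\), and compares the empirical mean and covariance of \(Y\) with \(A \mu + b\) and \(A \Sigma A^T\).

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])  # invertible, arbitrary example
b = np.array([3.0, -2.0])

# Draw samples of X ~ N(mu, Sigma) and map them through Y = A X + b.
X = rng.multivariate_normal(mu, Sigma, size=100_000)
Y = X @ A.T + b

print(Y.mean(axis=0), A @ mu + b)                # empirical mean vs A mu + b
print(np.cov(Y, rowvar=False), A @ Sigma @ A.T)  # empirical cov vs A Sigma A^T
```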
The CF is given by

\[
\Phi_X(t) = \EE \left[ e^{j t^T X} \right] = \exp \left( j t^T \mu - \frac{1}{2} t^T \Sigma t \right), \quad t \in \RR^n.
\]
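The closed form can likewise be checked against an empirical average of \(e^{j t^T X}\) over samples; this short sketch uses the same arbitrary \(\mu\) and \(\Sigma\) as above and an arbitrary test vector \(t\).

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
t = np.array([0.3, -0.7])

# Empirical CF: sample average of exp(j t^T X).
X = rng.multivariate_normal(mu, Sigma, size=500_000)
empirical = np.mean(np.exp(1j * (X @ t)))

# Closed form: exp(j t^T mu - (1/2) t^T Sigma t).
closed_form = np.exp(1j * (t @ mu) - 0.5 * (t @ Sigma @ t))

print(empirical, closed_form)  # the two should agree closely
```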
7.8.1. Whitening#
Usually we are interested in making the components of \(X\) uncorrelated. This process is known as whitening. We are looking for a linear transformation \(Y = A X + b\) such that the components of \(Y\) are uncorrelated, i.e. we start with

\[
X \sim \NNN_n (\mu, \Sigma)
\]

and transform \(Y = A X + b\) such that

\[
Y \sim \NNN_n (0, I_n)
\]

where \(I_n\) is the \(n\)-dimensional identity matrix.
7.8.1.1. Whitening by Eigenvalue Decomposition#
Let

\[
\Sigma = E \Lambda E^T
\]

be the eigenvalue decomposition of \(\Sigma\), with \(\Lambda = \text{diag} (\lambda_1, \dots, \lambda_n)\) being a diagonal matrix of eigenvalues and \(E\) an orthogonal matrix whose columns form an orthonormal basis of eigenvectors.
Let

\[
\Lambda^{\frac{1}{2}} = \text{diag} \left( \sqrt{\lambda_1}, \dots, \sqrt{\lambda_n} \right).
\]

Since \(\Sigma\) is invertible (hence positive definite), each \(\lambda_i > 0\).
Choose \(B = E \Lambda^{\frac{1}{2}}\) and \(A = B^{-1} = \Lambda^{-\frac{1}{2}} E^T\).
Then

\[
A \Sigma A^T = \Lambda^{-\frac{1}{2}} E^T \left( E \Lambda E^T \right) E \Lambda^{-\frac{1}{2}} = \Lambda^{-\frac{1}{2}} \Lambda \Lambda^{-\frac{1}{2}} = I_n.
\]
Thus the random vector \(Y = B^{-1} (X - \mu)\) is a whitened vector of uncorrelated components.
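Here is a minimal NumPy sketch of this eigendecomposition-based whitening; the mean \(\mu\), covariance \(\Sigma\), and sample size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

mu = np.array([1.0, -1.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])

# Eigenvalue decomposition Sigma = E Lambda E^T (eigh handles the symmetric case).
lam, E = np.linalg.eigh(Sigma)

# A = B^{-1} = Lambda^{-1/2} E^T
A = np.diag(1.0 / np.sqrt(lam)) @ E.T

# Whiten samples: Y = A (X - mu).
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = (X - mu) @ A.T

print(np.cov(Y, rowvar=False))  # should be close to the identity matrix
```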
7.8.1.2. Causal Whitening#
We want the transformation to be causal, i.e. \(A\) should be a lower triangular matrix. We start with the LDL decomposition

\[
\Sigma = L D L^T
\]

where \(L\) is a lower triangular matrix with unit diagonal and \(D\) is a diagonal matrix with positive entries.
Choose \(B = L D^{\frac{1}{2}} \) and \(A = B^{-1} = D^{-\frac{1}{2}} L^{-1}\). Clearly, \(A\) is lower triangular, since the inverse of a lower triangular matrix is lower triangular and \(D^{-\frac{1}{2}}\) is diagonal.
The transformation is \(Y = B^{-1} (X - \mu)\). As before,

\[
A \Sigma A^T = D^{-\frac{1}{2}} L^{-1} \left( L D L^T \right) L^{-T} D^{-\frac{1}{2}} = I_n.
\]
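A minimal NumPy/SciPy sketch of causal whitening follows. It exploits the fact that \(B = L D^{\frac{1}{2}}\) is exactly the lower triangular Cholesky factor \(C\) of \(\Sigma\) (both are lower triangular with positive diagonal and satisfy \(B B^T = \Sigma\)); \(\mu\), \(\Sigma\), and the sample size are arbitrary illustrative choices.

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(7)

mu = np.array([1.0, -1.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])

# Cholesky factor C satisfies Sigma = C C^T, C lower triangular.
# It factors as C = L D^{1/2} with D^{1/2} = diag(C) and L unit lower triangular.
C = np.linalg.cholesky(Sigma)
D_sqrt = np.diag(np.diag(C))   # D^{1/2}
L = C @ np.linalg.inv(D_sqrt)  # unit lower triangular factor

# A = B^{-1} = C^{-1} is lower triangular, so the transform is causal:
# each Y_i depends only on X_1, ..., X_i.
A = solve_triangular(C, np.eye(len(mu)), lower=True)

X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = (X - mu) @ A.T

print(np.allclose(Sigma, L @ D_sqrt**2 @ L.T))  # verifies Sigma = L D L^T
print(np.cov(Y, rowvar=False))                  # should be close to identity
```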