7.6. Expectation

This section contains several results on the expectation operator.

Any function g(x) defines a new random variable g(X). If X has density $f_X(x)$ and g(X) has a finite expectation, then

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx.$$
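
A quick numerical sketch of this formula, assuming for illustration that X is standard normal and $g(x) = x^2$, so that $E[g(X)] = 1$:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# Assumption: X ~ N(0, 1) and g(x) = x^2, so E[g(X)] = Var(X) = 1.
g = lambda x: x ** 2

# Right-hand side: integral of g(x) * f_X(x) over the real line.
integral, _ = integrate.quad(lambda x: g(x) * norm.pdf(x), -np.inf, np.inf)

# Left-hand side: Monte Carlo estimate of E[g(X)] from samples of X.
rng = np.random.default_rng(0)
samples = rng.standard_normal(1_000_000)
monte_carlo = g(samples).mean()

print(integral, monte_carlo)  # both close to 1.0
```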

If several random variables $X_1, \dots, X_n$ are defined on the same sample space, then their sum $X_1 + \dots + X_n$ is a new random variable. If all of them have finite expectations, then the expectation of their sum exists and is given by

$$E[X_1 + \dots + X_n] = E[X_1] + \dots + E[X_n].$$
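
A simulation sketch of this additivity, assuming three variables with different distributions; note that linearity of expectation does not require independence, so one of the variables is deliberately built from another:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Three random variables on the same sample space; X3 depends on X1.
x1 = rng.normal(loc=2.0, scale=1.0, size=n)   # E[X1] = 2
x2 = rng.exponential(scale=3.0, size=n)        # E[X2] = 3
x3 = x1 ** 2                                   # E[X3] = 1 + 2^2 = 5

lhs = (x1 + x2 + x3).mean()                    # E[X1 + X2 + X3]
rhs = x1.mean() + x2.mean() + x3.mean()        # E[X1] + E[X2] + E[X3]
print(lhs, rhs)  # both close to 10
```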

If X and Y are mutually independent random variables with finite expectations, then their product is a random variable with finite expectation and

$$E[XY] = E[X]\, E[Y].$$

By induction, if $X_1, \dots, X_n$ are mutually independent random variables with finite expectations, then

$$E\left[\prod_{i=1}^{n} X_i\right] = \prod_{i=1}^{n} E[X_i].$$
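
A sketch checking the product rule, assuming three mutually independent variables chosen for illustration; without independence the identity generally fails:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Three mutually independent random variables.
x1 = rng.uniform(0.0, 2.0, size=n)             # E[X1] = 1
x2 = rng.exponential(scale=0.5, size=n)        # E[X2] = 0.5
x3 = rng.normal(loc=3.0, scale=1.0, size=n)    # E[X3] = 3

lhs = (x1 * x2 * x3).mean()                    # E[X1 X2 X3]
rhs = x1.mean() * x2.mean() * x3.mean()        # E[X1] E[X2] E[X3]
print(lhs, rhs)  # both close to 1 * 0.5 * 3 = 1.5
```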

Let X and Y be two random variables with the joint density function $f_{X,Y}(x,y)$. Let the conditional density function of Y given X be $f(y|x)$. Then the conditional expectation of Y given X is defined as follows:

$$E[Y|X] = \int_{-\infty}^{\infty} y\, f(y|x)\, dy.$$

E[Y|X] is a new random variable; it is a function of X, taking the value E[Y|X=x] when X=x.
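
To illustrate that E[Y|X] is a function of X, it can be estimated empirically by averaging Y over samples with X near a chosen value. A rough sketch, assuming the model Y = 2X + Z with X and Z independent standard normals, so that E[Y|X] = 2X:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

x = rng.standard_normal(n)
z = rng.standard_normal(n)
y = 2.0 * x + z                      # assumed model: E[Y | X] = 2X

# Approximate E[Y | X = x0] by averaging y over samples with x near x0.
for x0 in (-1.0, 0.0, 1.0):
    mask = np.abs(x - x0) < 0.05
    print(x0, y[mask].mean())        # close to 2 * x0
```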

$$E[E[Y|X]] = \int E[Y|X]\, f_X(x)\, dx = \int\!\!\int y\, f(y|x)\, f_X(x)\, dy\, dx = \int y \left( \int f_{X,Y}(x,y)\, dx \right) dy = \int y\, f_Y(y)\, dy = E[Y].$$

In short, we have

$$E[E[Y|X]] = E[Y].$$
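
A simulation sketch of this identity, assuming X ~ Uniform(0, 1) and Y | X exponentially distributed with mean X, so that E[Y|X] = X and E[Y] = E[X] = 0.5:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

x = rng.uniform(0.0, 1.0, size=n)
y = rng.exponential(scale=x)     # Y | X ~ Exponential with mean X

cond_exp = x                     # E[Y | X] = X under this model
print(cond_exp.mean(), y.mean()) # both close to E[Y] = E[X] = 0.5
```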

The covariance of X and Y is defined as

$$\operatorname{Cov}(X,Y) = E[(X - E[X])(Y - E[Y])].$$

It is easy to see that

$$\operatorname{Cov}(X,Y) = E[XY] - E[X]\, E[Y].$$
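
A numerical check that the two expressions for covariance agree, assuming for illustration that Y is built from X with a shared component so that Cov(X, Y) = 0.7:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

x = rng.standard_normal(n)
y = 0.7 * x + rng.standard_normal(n)   # correlated with x by construction

cov_def = ((x - x.mean()) * (y - y.mean())).mean()   # E[(X-E[X])(Y-E[Y])]
cov_alt = (x * y).mean() - x.mean() * y.mean()       # E[XY] - E[X]E[Y]
print(cov_def, cov_alt)  # both close to 0.7
```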

The correlation coefficient is defined as

$$\rho = \frac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}(X)\,\operatorname{Var}(Y)}}.$$
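
The same construction as above can be used to compare this definition with NumPy's built-in estimator np.corrcoef (a sketch, again assuming Y = 0.7X + Z with X, Z independent standard normals):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(1_000_000)
y = 0.7 * x + rng.standard_normal(1_000_000)

cov = (x * y).mean() - x.mean() * y.mean()
rho = cov / np.sqrt(x.var() * y.var())
print(rho, np.corrcoef(x, y)[0, 1])  # both close to 0.7 / sqrt(1.49) ≈ 0.57
```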

7.6.1. Independent Variables

If X and Y are independent, then

$$E[g_1(X)\, g_2(Y)] = E[g_1(X)]\, E[g_2(Y)].$$
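
A quick sketch of this factorization, assuming for illustration $g_1 = \sin$, $g_2 = \exp$ and X, Y independent uniforms on (0, 1):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

x = rng.uniform(0.0, 1.0, size=n)   # X and Y drawn independently
y = rng.uniform(0.0, 1.0, size=n)

lhs = (np.sin(x) * np.exp(y)).mean()        # E[g1(X) g2(Y)]
rhs = np.sin(x).mean() * np.exp(y).mean()   # E[g1(X)] E[g2(Y)]
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```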

If X and Y are independent, then Cov(X,Y)=0.

7.6.2. Uncorrelated Variables

The two variables X and Y are called uncorrelated if Cov(X,Y)=0. Zero covariance, however, does not imply independence: independent variables are always uncorrelated, but uncorrelated variables need not be independent, as the example below shows.
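
The classic illustration is Y = X² with X symmetric around zero: Cov(X, Y) = E[X³] − E[X]E[X²] = 0, yet Y is completely determined by X. A simulation sketch, assuming X ~ Uniform(−1, 1):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x ** 2                       # determined by X, hence not independent of X

cov = (x * y).mean() - x.mean() * y.mean()
print(cov)                       # close to 0: uncorrelated but dependent
```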