Theory 1

Expectation of a function of two variables

Discrete case:

E[g(X,Y)] = \sum_{k,\ell} g(k,\ell)\, P_{X,Y}(k,\ell) \quad \text{(sum over all possible value pairs } (k,\ell) \text{)}

Continuous case:

E[g(X,Y)] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} g(x,y)\, f_{X,Y}(x,y)\, dx\, dy

These formulas are not trivial to prove, and we omit the proofs. (Recall the technical nature of the proof we gave for E[g(X)] in the discrete case.)
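Even without the proofs, the discrete formula is easy to use directly. The following sketch computes E[g(X,Y)] for a small made-up joint PMF (the probability values are illustrative, not from the notes):

```python
# Discrete formula: E[g(X,Y)] = sum over (k, l) of g(k, l) * P_{X,Y}(k, l).
# The joint PMF below is a made-up example; its values sum to 1.
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def expectation(g, pmf):
    """Weight g(k, l) by the joint probability of each pair (k, l) and sum."""
    return sum(g(k, l) * p for (k, l), p in pmf.items())

# For example, E[X + Y] under this PMF is 0.2 + 0.3 + 0.4*2 = 1.3:
print(expectation(lambda x, y: x + y, joint_pmf))
```

The same `expectation` function works for any g, e.g. `lambda x, y: x * y` for E[XY].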

Expectation sum rule

Suppose X and Y are any two random variables on the same probability model.

Then:

E[X+Y]=E[X]+E[Y]

We already know that expectation is linear in a single variable: E[aX+b]=aE[X]+b.

Combining the sum rule with single-variable linearity gives the general two-variable linearity formula:

E[aX+bY+c]=aE[X]+bE[Y]+c
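The identity holds whether or not X and Y are independent. Here is an exact check on a small made-up joint PMF (illustrative numbers; X and Y here are not independent):

```python
# Check E[aX + bY + c] = a E[X] + b E[Y] + c exactly on a small joint PMF.
joint_pmf = {(1, 2): 0.25, (1, 5): 0.25, (3, 2): 0.3, (3, 5): 0.2}

# Marginal expectations, computed from the joint PMF:
ex = sum(k * p for (k, _), p in joint_pmf.items())   # E[X]
ey = sum(l * p for (_, l), p in joint_pmf.items())   # E[Y]

a, b, c = 2.0, -1.0, 4.0
lhs = sum((a * k + b * l + c) * p for (k, l), p in joint_pmf.items())
rhs = a * ex + b * ey + c
print(lhs, rhs)  # equal up to floating point
```

Note that no independence assumption was used anywhere in the computation, matching the statement of the sum rule.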

Expectation product rule: independence

Suppose that X and Y are independent.

Then we have:

E[XY]=E[X]E[Y]
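A quick way to see this numerically: build a joint PMF as the product of two made-up marginal PMFs (which is exactly what independence means for discrete variables) and compare E[XY] with E[X] E[Y]:

```python
# Product rule under independence, checked exactly on made-up marginals.
px = {0: 0.3, 1: 0.7}   # marginal PMF of X (illustrative values)
py = {2: 0.4, 5: 0.6}   # marginal PMF of Y (illustrative values)

# Independence: P_{X,Y}(k, l) = P_X(k) * P_Y(l)
joint = {(k, l): px[k] * py[l] for k in px for l in py}

e_xy = sum(k * l * p for (k, l), p in joint.items())
e_x = sum(k * p for k, p in px.items())
e_y = sum(l * p for l, p in py.items())
print(e_xy, e_x * e_y)  # equal up to floating point
```

If the joint PMF did not factor into the product of the marginals, the two quantities could differ.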

Extra - Proof: Expectation sum rule, continuous case

Suppose f_X and f_Y give the marginal PDFs of X and Y, and f_{X,Y} gives their joint PDF.

Then:

\begin{aligned}
E[X+Y] &= \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} (x+y)\, f_{X,Y}(x,y)\, dx\, dy \\
&= \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x\, f_{X,Y}\, dx\, dy + \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} y\, f_{X,Y}\, dx\, dy \\
&= \int_{-\infty}^{+\infty} x \left( \int_{-\infty}^{+\infty} f_{X,Y}\, dy \right) dx + \int_{-\infty}^{+\infty} y \left( \int_{-\infty}^{+\infty} f_{X,Y}\, dx \right) dy \\
&= \int_{-\infty}^{+\infty} x\, f_X(x)\, dx + \int_{-\infty}^{+\infty} y\, f_Y(y)\, dy \\
&= E[X] + E[Y]
\end{aligned}

Observe that this calculation relies on the formula for E[g(X,Y)], specifically with g(x,y)=x+y.
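For comparison, the analogous argument in the discrete case (a sketch, not given in the notes) uses the discrete formula for E[g(X,Y)] with g(k,ℓ) = k + ℓ, splitting the sum and summing out one variable at a time to recover the marginal PMFs:

```latex
\begin{aligned}
E[X+Y] &= \sum_{k,\ell} (k+\ell)\, P_{X,Y}(k,\ell) \\
&= \sum_k k \sum_\ell P_{X,Y}(k,\ell) + \sum_\ell \ell \sum_k P_{X,Y}(k,\ell) \\
&= \sum_k k\, P_X(k) + \sum_\ell \ell\, P_Y(\ell) \\
&= E[X] + E[Y]
\end{aligned}
```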

Extra - Proof: Expectation product rule

\begin{aligned}
E[XY] &= \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} xy\, f_{X,Y}(x,y)\, dx\, dy \\
&= \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} xy\, f_X(x)\, f_Y(y)\, dx\, dy \quad \text{(independence)} \\
&= \left( \int_{-\infty}^{+\infty} x\, f_X(x)\, dx \right) \left( \int_{-\infty}^{+\infty} y\, f_Y(y)\, dy \right) \\
&= E[X]\, E[Y]
\end{aligned}