Review: True/False

TRUE or FALSE:

(a) Suppose Cov[X,Y]=0.05. It is possible that ρX,Y=0.05.

(b) Suppose Cov[X,Y]=0.05. It is possible that X and Y have a strong linear relationship.

(c) Suppose Cov[X,Y]=0. It is possible that X and Y are not independent.

(d) Suppose X and Y are not independent. It is possible that Cov[X,Y] is equal to 0.

(e) Suppose X and Y are independent. Cov[X,Y] must be equal to 0.
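A quick numeric sketch of items (c) and (d), using the classic example of X uniform on {−1, 0, 1} and Y = X² (an invented illustration, not from the review itself): the variables are clearly dependent, yet their covariance is 0.

```python
import numpy as np

# Illustration for (c)/(d): X uniform on {-1, 0, 1}, Y = X^2.
# Y is a function of X, so they are dependent, yet
# Cov[X, Y] = E[XY] - E[X]E[Y] = 0.
xs = np.array([-1, 0, 1])
ps = np.array([1/3, 1/3, 1/3])

EX = np.sum(xs * ps)            # E[X] = 0
EY = np.sum(xs**2 * ps)         # E[Y] = E[X^2] = 2/3
EXY = np.sum(xs**3 * ps)        # E[XY] = E[X^3] = 0
cov = EXY - EX * EY
print(cov)  # 0.0
```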

Review: Conditional probability

Review

Recall some items related to conditional probability.

Conditioning definition:

$$P[B \mid A] = \frac{P[A \cap B]}{P[A]}$$

Multiplication rule:

$$P[A \cap B] = P[B \mid A]\,P[A]$$

Division into Cases / Total Probability:

$$P[B] = P[B \mid A_1]P[A_1] + \cdots + P[B \mid A_n]P[A_n]$$

Conditional distribution

01 Theory

Theory 1

Conditional distribution - fixed event

Suppose X is a random variable, and suppose A ⊆ ℝ with P[A] > 0. The distribution of X conditioned on A describes the probabilities of values of X given knowledge that X ∈ A.

Discrete case:

$$P_{X|A}(k) = \begin{cases} \dfrac{P_X(k)}{P[A]} & k \in A \\ 0 & k \notin A \end{cases}$$

Continuous case:

$$f_{X|A}(x) = \begin{cases} \dfrac{f_X(x)}{P[A]} & x \in A \\ 0 & x \notin A \end{cases}$$

There is also a conditional CDF, of which this conditional PDF is the derivative:

$$F_{X|A}(x) = P[X \le x \mid A], \qquad f_{X|A}(x) = \frac{d}{dx} F_{X|A}(x)$$

The Law of Total Probability has versions for distributions:

$$P_X(k) = P_{X|A_1}(k)\,P[A_1] + \cdots + P_{X|A_n}(k)\,P[A_n]$$

$$f_X(x) = f_{X|A_1}(x)\,P[A_1] + \cdots + f_{X|A_n}(x)\,P[A_n]$$
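The discrete version of this decomposition can be verified numerically. This is a minimal sketch assuming a fair six-sided die, with the partition A1 = {even outcomes}, A2 = {odd outcomes} (both invented for illustration):

```python
# Verify P_X(k) = P_{X|A1}(k) P[A1] + P_{X|A2}(k) P[A2] for a fair die.
pX = {k: 1/6 for k in range(1, 7)}
A1 = {2, 4, 6}
A2 = {1, 3, 5}
pA1 = sum(pX[k] for k in A1)  # P[A1] = 1/2
pA2 = sum(pX[k] for k in A2)  # P[A2] = 1/2

def cond_pmf(k, A, pA):
    # P_{X|A}(k) = P_X(k) / P[A] if k in A, else 0
    return pX[k] / pA if k in A else 0.0

for k in range(1, 7):
    recombined = cond_pmf(k, A1, pA1) * pA1 + cond_pmf(k, A2, pA2) * pA2
    assert abs(recombined - pX[k]) < 1e-12
print("total probability check passed")
```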

Conditional distribution - variable event

Suppose X and Y are any two random variables. The distribution of X conditioned on Y describes the probabilities of values of X in terms of y, given knowledge that Y=y.

Discrete case:

$$P_{X|Y}(k \mid \ell) = P[X = k \mid Y = \ell] = \frac{P_{X,Y}(k,\ell)}{P_Y(\ell)} \quad (\text{assuming } P_Y(\ell) \ne 0)$$

Continuous case:

$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} \quad (\text{assuming } f_Y(y) \ne 0)$$

Remember: P_{X,Y}(k,ℓ) is the probability that “X = k and Y = ℓ.”

Sometimes it is useful to have the formulas rewritten like this:

$$P_{X,Y}(k,\ell) = P_{X|Y}(k \mid \ell)\,P_Y(\ell)$$

$$f_{X,Y}(x,y) = f_{X|Y}(x \mid y)\,f_Y(y)$$

Extra - Deriving fX|Y(x|y)

The density f_{X|Y} ought to be such that f_{X|Y}(x|y)dx gives the probability of X ∈ [x, x+dx], given knowledge that Y ∈ [y, y+dy]. Calculate this probability:

$$P[x \le X \le x+dx \mid y \le Y \le y+dy] = \frac{P[x \le X \le x+dx,\; y \le Y \le y+dy]}{P[y \le Y \le y+dy]} \approx \frac{f_{X,Y}(x,y)\,dx\,dy}{f_Y(y)\,dy} = \frac{f_{X,Y}(x,y)}{f_Y(y)}\,dx$$

02 Illustration

Example - Conditional PMF, variable event, via joint density

Conditional PMF, variable event, via joint density

Suppose X and Y have joint PMF given by:

$$P_{X,Y}(k,\ell) = \begin{cases} \dfrac{k+\ell}{21} & k = 1,2,3;\; \ell = 1,2 \\ 0 & \text{otherwise} \end{cases}$$

Find P_{X|Y}(k|ℓ) and P_{Y|X}(ℓ|k).

Solution

Marginal PMFs:

$$P_X(k) = \frac{2k+3}{21},\quad k = 1,2,3 \qquad\qquad P_Y(\ell) = \frac{\ell+2}{7},\quad \ell = 1,2$$

Assuming ℓ = 1 or 2, for each k = 1, 2, 3 we have:

$$P_{X|Y}(k \mid \ell) = \frac{P_{X,Y}(k,\ell)}{P_Y(\ell)} = \frac{k+\ell}{3\ell+6}$$

Assuming k = 1, 2, or 3, for each ℓ = 1, 2 we have:

$$P_{Y|X}(\ell \mid k) = \frac{P_{X,Y}(k,\ell)}{P_X(k)} = \frac{k+\ell}{2k+3}$$
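The identities in this example can be double-checked with exact rational arithmetic; the helper `p_joint` below is just the joint PMF from the problem statement:

```python
from fractions import Fraction as F

# Check the example: P_{X,Y}(k, l) = (k + l)/21 for k = 1,2,3 and l = 1,2.
# Verify the marginals and both conditional PMFs.
def p_joint(k, l):
    return F(k + l, 21)

for l in (1, 2):
    pY = sum(p_joint(k, l) for k in (1, 2, 3))
    assert pY == F(l + 2, 7)                            # P_Y(l) = (l+2)/7
    for k in (1, 2, 3):
        assert p_joint(k, l) / pY == F(k + l, 3*l + 6)  # P_{X|Y}(k|l)

for k in (1, 2, 3):
    pX = sum(p_joint(k, l) for l in (1, 2))
    assert pX == F(2*k + 3, 21)                         # P_X(k) = (2k+3)/21
    for l in (1, 2):
        assert p_joint(k, l) / pX == F(k + l, 2*k + 3)  # P_{Y|X}(l|k)

print("all marginal/conditional identities check out")
```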

Conditional expectation

03 Theory

Theory 1

Expectation conditioned by a fixed event

Suppose X is a random variable and A ⊆ ℝ with P[A] > 0. The expectation of X conditioned on A describes the typical value of X given the hypothesis that X ∈ A is known.

Discrete case:

$$E[X \mid A] = \sum_k k\,P_{X|A}(k) \qquad E[g(X) \mid A] = \sum_k g(k)\,P_{X|A}(k)$$

Continuous case:

$$E[X \mid A] = \int_{-\infty}^{+\infty} x\,f_{X|A}(x)\,dx \qquad E[g(X) \mid A] = \int_{-\infty}^{+\infty} g(x)\,f_{X|A}(x)\,dx$$

Conditional variance:

$$\mathrm{Var}[X \mid A] = E[(X - \mu_{X|A})^2 \mid A] = E[X^2 \mid A] - \mu_{X|A}^2$$

Division into Cases / Total Probability applied to expectation:

$$E[X] = E[X \mid A_1]P[A_1] + \cdots + E[X \mid A_n]P[A_n]$$

Linearity of conditional expectation:

$$E[aX_1 + bX_2 + c \mid Y = y] = a\,E[X_1 \mid Y = y] + b\,E[X_2 \mid Y = y] + c$$

Extra - Proof: Division of Expectation into Cases

We prove the discrete case only.

  1. Expectation formula:
$$E[X] = \sum_k k\,P_X(k)$$
  2. Division into Cases for the PMF:
$$P_X(k) = \sum_{i=1}^n P_{X|A_i}(k)\,P[A_i]$$
  3. Substitute into the formula for E[X]:
$$\sum_k k\,P_X(k) = \sum_k k \sum_{i=1}^n P_{X|A_i}(k)\,P[A_i] = \sum_{i=1}^n P[A_i] \sum_k k\,P_{X|A_i}(k) = \sum_{i=1}^n P[A_i]\,E[X \mid A_i]$$
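A numeric spot-check of the division-into-cases identity for expectations, using a fair six-sided die partitioned into even and odd outcomes (an invented example):

```python
import numpy as np

# E[X] should equal sum_i E[X|A_i] P[A_i].  Fair die; A1 = even, A2 = odd.
ks = np.arange(1, 7)
pX = np.full(6, 1/6)

EX = np.sum(ks * pX)  # E[X] = 3.5
total = 0.0
for A in ({2, 4, 6}, {1, 3, 5}):
    mask = np.isin(ks, list(A))
    pA = pX[mask].sum()                          # P[A_i] = 1/2
    E_given_A = np.sum(ks[mask] * pX[mask]) / pA # E[X | A_i]
    total += E_given_A * pA
print(EX, total)
```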

Expectation conditioned by a variable event

Suppose X and Y are any two random variables. The expectation of X conditioned on Y=y describes the typical value of X in terms of y, given the hypothesis that Y=y is known.

Discrete case:

$$E[X \mid Y = y] = \sum_k k\,P_{X|Y}(k \mid y) \quad (k \text{ over all possible values})$$

$$E[g(X,Y) \mid Y = y] = \sum_k g(k,y)\,P_{X|Y}(k \mid y)$$

Continuous case:

$$E[X \mid Y = y] = \int_{-\infty}^{+\infty} x\,f_{X|Y}(x \mid y)\,dx \qquad E[g(X,Y) \mid Y = y] = \int_{-\infty}^{+\infty} g(x,y)\,f_{X|Y}(x \mid y)\,dx$$

05 Illustration

Example - Conditional PMF, fixed event, expectation

Conditional PMF, fixed event, expectation

Suppose X measures the lengths of some items and has the following PMF:

$$P_X(k) = \begin{cases} 0.15 & k = 1,2,3,4 \\ 0.1 & k = 5,6,7,8 \\ 0 & \text{otherwise} \end{cases}$$

Let L = {X ≥ 5}, an event.

(a) Find the conditional PMF of X given that L is known.

(b) Find the conditional expected value and variance of X given L.

Solution

(a)

Conditional PMF formula with kL plugged in:

$$P_{X|L}(k) = \begin{cases} \dfrac{P_X(k)}{P[L]} & k = 5,6,7,8 \\ 0 & \text{otherwise} \end{cases}$$

Compute P[L] by adding cases:

$$P[L] = \sum_{k=5}^{8} P_X(k) = 0.4$$

Divide the nonzero PMF entries by P[L] = 0.4:

$$P_{X|L}(k) = \begin{cases} 0.25 & k = 5,6,7,8 \\ 0 & \text{otherwise} \end{cases}$$

(b)

Find E[X|L]:

$$E[X \mid L] = \sum_{k=5}^{8} k\,P_{X|L}(k) = 5(0.25) + 6(0.25) + 7(0.25) + 8(0.25) = 6.5 \text{ min}$$

Find E[X2|L]:

$$E[X^2 \mid L] = \sum_{k=5}^{8} k^2\,P_{X|L}(k) = 5^2(0.25) + 6^2(0.25) + 7^2(0.25) + 8^2(0.25) = 43.5 \text{ min}^2$$

Find Var[X|L] using “short form” with conditioning:

$$\mathrm{Var}[X \mid L] = E[X^2 \mid L] - E[X \mid L]^2 = 43.5 - 6.5^2 = 1.25 \text{ min}^2$$
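A short numeric check of parts (a) and (b):

```python
import numpy as np

# P_X(k) = 0.15 for k = 1..4, 0.1 for k = 5..8; condition on L = {X >= 5}.
ks = np.arange(1, 9)
pX = np.where(ks <= 4, 0.15, 0.1)

mask = ks >= 5
pL = pX[mask].sum()                     # P[L] = 0.4
pX_given_L = pX[mask] / pL              # each entry 0.25

E = np.sum(ks[mask] * pX_given_L)       # E[X|L]  = 6.5
E2 = np.sum(ks[mask]**2 * pX_given_L)   # E[X^2|L] = 43.5
Var = E2 - E**2                         # Var[X|L] = 1.25
print(E, E2, Var)
```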

04 Illustration

Example - Conditional expectations from joint density

Conditional expectations from joint density

Suppose X and Y are random variables with joint density given by:

$$f_{X,Y}(x,y) = \begin{cases} \dfrac{1}{y} e^{-x/y} e^{-y} & x, y \in (0, \infty) \\ 0 & \text{otherwise} \end{cases}$$

Find E[X|Y=y].

Solution

(1) Derive the marginal density fY(y):

fY(y)0+1yex/yeydxex/yey|x=0ey

(2) Use fY(y) to compute fX|Y(x|y):

$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{1}{y} e^{-x/y} e^{-y} \left(e^{-y}\right)^{-1} = \frac{1}{y} e^{-x/y}$$

(3) Use fX|Y(x|y) to calculate expectation conditioned on the variable event:

$$E[X \mid Y = y] = \int_{-\infty}^{+\infty} x\,f_{X|Y}(x \mid y)\,dx = \int_0^{\infty} \frac{x}{y} e^{-x/y}\,dx = y$$
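Since the conditional density (1/y)e^(−x/y) is an Exponential density with mean y, the result E[X | Y = y] = y can be confirmed by numerical integration; the cutoff 50y below is an arbitrary truncation of the infinite range:

```python
import numpy as np

# Numerically confirm E[X | Y = y] = y for f_{X|Y}(x|y) = (1/y) e^{-x/y}.
for y in (0.5, 1.0, 2.0):
    x = np.linspace(0.0, 50 * y, 200001)   # truncate the infinite range
    dx = x[1] - x[0]
    f = (1.0 / y) * np.exp(-x / y)
    E = np.sum(x * f) * dx                  # Riemann sum for the integral
    assert abs(E - y) < 1e-3
print("E[X|Y=y] = y confirmed numerically")
```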

05 Theory - extra (non-examinable)

Theory 2

Expectation conditioned by a random variable

Suppose X and Y are any two random variables. The expectation of X conditioned on Y is a random variable giving the typical value of X on the assumption that Y has value determined by an outcome of the experiment.

$$E[X \mid Y] = g(Y) \quad \text{where} \quad g(y) = E[X \mid Y = y]$$

In other words, start by defining a function g(y):

$$g : y \mapsto E[X \mid Y = y]$$

Now E[X|Y] is defined as the composite random variable g(Y).

Considered as a random variable, E[X|Y] takes an outcome s ∈ S, computes Y(s), sets y = Y(s), then returns the expectation of X conditioned on Y = y.

Notice that X is not evaluated at s, only Y is.

Because the value of E[X|Y] depends only on Y(s), and not on any additional information about s, it is common to represent a conditional expectation E[X|Y] using only the function g.


Iterated Expectation

$$E\big[E[X \mid Y]\big] = E[X]$$

Proof of Iterated Expectation, discrete case

$$E\big[E[X \mid Y]\big] = \sum_{\ell} E[X \mid Y = \ell]\,P_Y(\ell) = \sum_{\ell} \sum_k k\,P_{X|Y}(k \mid \ell)\,P_Y(\ell) = \sum_k k \sum_{\ell} P_{X,Y}(k,\ell) = \sum_k k\,P_X(k) = E[X]$$

06 Illustration

Example - Conditional Expectation Variable

Sum of random number of RVs

Let N denote the number of customers that enter a store on a given day.

Let Xi denote the amount spent by the ith customer.

Assume that E[N] = 50 and E[X_i] = $8 for each i.

What is the expected total spend of all customers in a day?

Solution

A formula for the total spend is $X = \sum_{i=1}^{N} X_i$.

By Iterated Expectation, we know E[X]=E[E[X|N]].

Now compute E[X|N] as a function of N:

$$E[X \mid N = n] = E\left[\sum_{i=1}^{N} X_i \,\middle|\, N = n\right] = E\left[\sum_{i=1}^{n} X_i \,\middle|\, N = n\right] = \sum_{i=1}^{n} E[X_i \mid N = n] = \sum_{i=1}^{n} E[X_i] = 8n$$

(The step $E[X_i \mid N = n] = E[X_i]$ requires that each $X_i$ be independent of $N$.)

Therefore g(n) = 8n, so E[X|N] = g(N) = 8N.

Then by Iterated Expectation, E[X] = E[8N] = 8E[N] = $400.
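The answer depends only on E[N] and E[X_i], which a simulation can illustrate; the Poisson and Exponential distributions below are illustrative assumptions (the problem fixes only the means), with each X_i independent of N:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the store example.  Distributions are invented for illustration:
# N ~ Poisson(50) gives E[N] = 50; X_i ~ Exponential(mean 8) gives E[X_i] = 8.
trials = 20000
totals = np.empty(trials)
for t in range(trials):
    n = rng.poisson(50)
    totals[t] = rng.exponential(scale=8.0, size=n).sum()

print(totals.mean())  # close to E[X] = 8 * 50 = 400
```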
