Review: True/False

TRUE or FALSE:

(a) Suppose $\operatorname{Cov}(X, Y) = 0$. It is possible that the correlation coefficient $\rho(X, Y) \neq 0$.

(b) Suppose $\rho(X, Y) = 0$. It is possible that $X$ and $Y$ have a strong linear relationship.

(c) Suppose $\operatorname{Cov}(X, Y) = 0$. It is possible that $X$ and $Y$ are not independent.

(d) Suppose X and Y are not independent. It is possible that $\operatorname{Cov}(X, Y)$ is equal to 0.

(e) Suppose X and Y are independent. $\operatorname{Cov}(X, Y)$ must be equal to 0.
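Items (c)-(e) rest on the fact that zero covariance does not imply independence. A quick sketch of the classic counterexample, $X$ uniform on $\{-1, 0, 1\}$ with $Y = X^2$ (an illustration added here, not part of the original problem set):

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2: covariance is 0, yet X and Y are dependent.
pmf = {x: Fraction(1, 3) for x in (-1, 0, 1)}    # P(X = x)

E_X  = sum(x * p for x, p in pmf.items())        # E[X] = 0
E_Y  = sum(x**2 * p for x, p in pmf.items())     # E[Y] = E[X^2] = 2/3
E_XY = sum(x**3 * p for x, p in pmf.items())     # E[XY] = E[X^3] = 0
cov  = E_XY - E_X * E_Y                          # Cov(X, Y) = 0

# Independence would force P(X=0, Y=0) = P(X=0) P(Y=0), but:
p_joint = pmf[0]            # P(X=0, Y=0) = P(X=0) = 1/3, since X=0 forces Y=0
p_prod  = pmf[0] * pmf[0]   # P(X=0) P(Y=0) = 1/9, since Y=0 exactly when X=0
```

Here $\{X = 0\}$ forces $\{Y = 0\}$, so the joint probability $1/3$ cannot factor as $(1/3)(1/3)$: the pair has covariance 0 but is not independent.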

Review: Conditional probability

Recall some items related to conditional probability.

Conditioning definition: for $P(B) > 0$,

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$

Multiplication rule:

$$P(A \cap B) = P(A \mid B)\,P(B)$$

Division into Cases / Total Probability: for a partition $B_1, \dots, B_n$ of the sample space,

$$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)$$
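These three formulas can be checked exactly on a toy experiment. A minimal sketch using one fair die roll, with events chosen arbitrarily for illustration:

```python
from fractions import Fraction

# One fair die roll; events are sets of faces, probabilities are exact fractions.
omega = {1, 2, 3, 4, 5, 6}

def P(E):
    return Fraction(len(E), len(omega))

def cond(E, F):
    # Conditioning definition: P(E | F) = P(E ∩ F) / P(F)
    return P(E & F) / P(F)

A = {2, 4, 6}       # "roll is even"
B = {3, 4, 5, 6}    # "roll is at least 3"
Bc = omega - B

# Multiplication rule: P(A ∩ B) = P(A | B) P(B)
mult_ok = P(A & B) == cond(A, B) * P(B)

# Total Probability over the partition {B, B^c}:
total = cond(A, B) * P(B) + cond(A, Bc) * P(Bc)
```

Using `Fraction` keeps every probability exact, so both identities hold with equality rather than floating-point approximation.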

Conditional distribution

01 Theory

Conditional distribution - fixed event

Suppose $X$ is a random variable, and suppose $P(A) > 0$. The distribution of $X$ conditioned on $A$ describes the probabilities of values of $X$ given knowledge that $A$ occurred.

Discrete case:

$$p_{X \mid A}(x) = P(X = x \mid A) = \frac{P(\{X = x\} \cap A)}{P(A)}$$

Continuous case: $f_{X \mid A}(x)$ is the density satisfying

$$P(X \in B \mid A) = \int_B f_{X \mid A}(x)\,dx \quad \text{for every interval } B$$

There is also a conditional CDF, of which this conditional PDF is the derivative:

$$F_{X \mid A}(x) = P(X \le x \mid A), \qquad f_{X \mid A}(x) = \frac{d}{dx} F_{X \mid A}(x)$$

The Law of Total Probability has versions for distributions: for a partition $A_1, \dots, A_n$ of the sample space,

$$p_X(x) = \sum_i p_{X \mid A_i}(x)\,P(A_i), \qquad f_X(x) = \sum_i f_{X \mid A_i}(x)\,P(A_i)$$
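To make the fixed-event case concrete, here is a sketch with a die roll $X$ and the event $A = \{X \text{ is even}\}$ (toy numbers of my own), verifying that the conditional PMF sums to 1 and that the total-probability identity recovers $p_X$:

```python
from fractions import Fraction

# X = one fair die roll; fixed event A = "X is even".
p_X = {x: Fraction(1, 6) for x in range(1, 7)}
A = {2, 4, 6}
Ac = set(p_X) - A
P_A = sum(p_X[x] for x in A)     # 1/2
P_Ac = 1 - P_A

# p_{X|A}(x) = P({X=x} ∩ A) / P(A): nonzero only for x in A.
p_X_given_A  = {x: (p_X[x] / P_A  if x in A  else Fraction(0)) for x in p_X}
p_X_given_Ac = {x: (p_X[x] / P_Ac if x in Ac else Fraction(0)) for x in p_X}

# Law of Total Probability for PMFs over the partition {A, A^c}:
recovered = {x: p_X_given_A[x] * P_A + p_X_given_Ac[x] * P_Ac for x in p_X}
```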

Conditional distribution - variable event

Suppose $X$ and $Y$ are any two random variables. The distribution of $X$ conditioned on $Y$ describes the probabilities of values of $X$ in terms of $y$, given knowledge that $Y = y$.

Discrete case:

$$p_{X \mid Y}(x \mid y) = P(X = x \mid Y = y) = \frac{p_{X,Y}(x, y)}{p_Y(y)}$$

Continuous case:

$$f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}$$

Remember: $p_{X,Y}(x, y)$ is the probability that "$X = x$ and $Y = y$."

Sometimes it is useful to have the formulas rewritten like this:

$$p_{X,Y}(x, y) = p_{X \mid Y}(x \mid y)\,p_Y(y), \qquad f_{X,Y}(x, y) = f_{X \mid Y}(x \mid y)\,f_Y(y)$$
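A sketch of the variable-event formulas on a small made-up joint PMF (the numbers below are mine, chosen only so the table sums to 1):

```python
from fractions import Fraction
F = Fraction

# Hypothetical joint PMF p_{X,Y}(x, y) for x, y in {0, 1}.
p_XY = {(0, 0): F(1, 8), (0, 1): F(1, 4),
        (1, 0): F(1, 4), (1, 1): F(3, 8)}

# Marginal p_Y(y) = sum over x of p_{X,Y}(x, y).
p_Y = {}
for (x, y), p in p_XY.items():
    p_Y[y] = p_Y.get(y, F(0)) + p

# Conditional PMF: p_{X|Y}(x | y) = p_{X,Y}(x, y) / p_Y(y).
p_X_given_Y = {(x, y): p / p_Y[y] for (x, y), p in p_XY.items()}

# Each slice y = const is a genuine PMF, and the rewritten formula
# p_{X,Y}(x, y) = p_{X|Y}(x | y) p_Y(y) recovers the joint table.
slices_ok = all(sum(p_X_given_Y[(x, y)] for x in (0, 1)) == 1 for y in (0, 1))
joint_ok  = all(p_X_given_Y[(x, y)] * p_Y[y] == p_XY[(x, y)] for (x, y) in p_XY)
```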

Extra - Deriving $f_{X \mid Y}$

The density $f_{X \mid Y}(x \mid y)$ ought to be such that $f_{X \mid Y}(x \mid y)\,dx$ gives the probability of $x \le X \le x + dx$, given knowledge that $y \le Y \le y + dy$. Calculate this probability:

$$P(x \le X \le x + dx \mid y \le Y \le y + dy) = \frac{P(x \le X \le x + dx,\ y \le Y \le y + dy)}{P(y \le Y \le y + dy)} \approx \frac{f_{X,Y}(x, y)\,dx\,dy}{f_Y(y)\,dy} = \frac{f_{X,Y}(x, y)}{f_Y(y)}\,dx$$

02 Illustration

Example - Conditional PMF, variable event, via joint density

Suppose $X$ and $Y$ have joint PMF $p_{X,Y}(x, y)$ given by:

Find $p_{X \mid Y}(x \mid y)$ and $p_{Y \mid X}(y \mid x)$.

Solution

Marginal PMFs: $p_X(x) = \sum_y p_{X,Y}(x, y)$ and $p_Y(y) = \sum_x p_{X,Y}(x, y)$.

Assuming $y$ is a value with $p_Y(y) > 0$, for each $x$ we have:

$$p_{X \mid Y}(x \mid y) = \frac{p_{X,Y}(x, y)}{p_Y(y)}$$

Assuming $x$ is a value with $p_X(x) > 0$, for each $y$ we have:

$$p_{Y \mid X}(y \mid x) = \frac{p_{X,Y}(x, y)}{p_X(x)}$$

Conditional expectation

03 Theory

Expectation conditioned by a fixed event

Suppose $X$ is a random variable and $P(A) > 0$. The expectation of $X$ conditioned on $A$ describes the typical value of $X$ given the hypothesis that $A$ is known.

Discrete case:

$$E[X \mid A] = \sum_x x\,p_{X \mid A}(x)$$

Continuous case:

$$E[X \mid A] = \int_{-\infty}^{\infty} x\,f_{X \mid A}(x)\,dx$$

Conditional variance:

$$\operatorname{Var}(X \mid A) = E\big[X^2 \mid A\big] - E[X \mid A]^2$$

Division into Cases / Total Probability applied to expectation: for a partition $A_1, \dots, A_n$ of the sample space,

$$E[X] = \sum_i E[X \mid A_i]\,P(A_i)$$

Linearity of conditional expectation:

$$E[aX + bY \mid A] = a\,E[X \mid A] + b\,E[Y \mid A]$$
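Division into Cases for expectation can be verified exactly on a die roll, partitioning on parity (a toy check of my own, not from the notes):

```python
from fractions import Fraction

# X = fair die roll; partition {evens, odds} of its values.
p_X = {x: Fraction(1, 6) for x in range(1, 7)}
E_X = sum(x * p for x, p in p_X.items())   # E[X] = 7/2

def cond_exp(A):
    # E[X | A] = sum over x in A of x * p_X(x) / P(A)
    P_A = sum(p_X[x] for x in A)
    return sum(x * p_X[x] / P_A for x in A)

evens, odds = {2, 4, 6}, {1, 3, 5}
# E[X] = E[X | evens] P(evens) + E[X | odds] P(odds)
cases = cond_exp(evens) * Fraction(1, 2) + cond_exp(odds) * Fraction(1, 2)
```

Here $E[X \mid \text{evens}] = 4$ and $E[X \mid \text{odds}] = 3$, whose average over the partition is $7/2 = E[X]$.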

Extra - Proof: Division of Expectation into Cases

We prove the discrete case only.

  1. Expectation formula: $E[X] = \sum_x x\,p_X(x)$
  2. Division into Cases for the PMF: $p_X(x) = \sum_i p_{X \mid A_i}(x)\,P(A_i)$
  3. Substitute in the formula for $E[X]$: $E[X] = \sum_x x \sum_i p_{X \mid A_i}(x)\,P(A_i) = \sum_i \Big(\sum_x x\,p_{X \mid A_i}(x)\Big) P(A_i) = \sum_i E[X \mid A_i]\,P(A_i)$

Expectation conditioned by a variable event

Suppose $X$ and $Y$ are any two random variables. The expectation of $X$ conditioned on $Y = y$ describes the typical value of $X$ in terms of $y$, given the hypothesis that $Y = y$ is known.

Discrete case:

$$E[X \mid Y = y] = \sum_x x\,p_{X \mid Y}(x \mid y)$$

Continuous case:

$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\,f_{X \mid Y}(x \mid y)\,dx$$

05 Illustration

Example - Conditional PMF, fixed event, expectation

Suppose $X$ measures the lengths of some items and has the following PMF:

Let $A$ be the event that $X$ lies in a given range of lengths.

(a) Find the conditional PMF of $X$ given that $A$ is known.

(b) Find the conditional expected value and variance of $X$ given $A$.

Solution

(a)

Conditional PMF formula with $A$ plugged in:

$$p_{X \mid A}(x) = P(X = x \mid A) = \frac{P(\{X = x\} \cap A)}{P(A)}$$

Compute $P(A)$ by adding cases: $P(A) = \sum_{x \in A} p_X(x)$

Divide nonzero PMF entries by $P(A)$: for each $x \in A$, $p_{X \mid A}(x) = \dfrac{p_X(x)}{P(A)}$, and $p_{X \mid A}(x) = 0$ otherwise.


(b)

Find $E[X \mid A]$: $E[X \mid A] = \sum_x x\,p_{X \mid A}(x)$

Find $E[X^2 \mid A]$: $E[X^2 \mid A] = \sum_x x^2\,p_{X \mid A}(x)$

Find $\operatorname{Var}(X \mid A)$ using the "short form" with conditioning: $\operatorname{Var}(X \mid A) = E[X^2 \mid A] - E[X \mid A]^2$

04 Theory - extra

Expectation conditioned by a random variable

Suppose $X$ and $Y$ are any two random variables. The expectation of $X$ conditioned on $Y$, written $E[X \mid Y]$, is a random variable giving the typical value of $X$ on the assumption that $Y$ has the value determined by an outcome of the experiment.

In other words, start by defining a function $g(y)$:

$$g(y) = E[X \mid Y = y]$$

Now $E[X \mid Y]$ is defined as the composite random variable $g(Y) = g \circ Y$.

Considered as a random variable, $E[X \mid Y]$ takes an outcome $\omega$, computes $Y(\omega)$, sets $y = Y(\omega)$, then returns the expectation of $X$ conditioned on $Y = y$.

Notice that $X$ is not evaluated at $\omega$; only $Y$ is.

Because the value of $E[X \mid Y]$ depends only on $Y(\omega)$, and not on any additional information about $\omega$, it is common to represent a conditional expectation using only the function $g(y)$.


Iterated Expectation:

$$E\big[E[X \mid Y]\big] = E[X]$$

Proof of Iterated Expectation, discrete case:

$$E\big[E[X \mid Y]\big] = \sum_y E[X \mid Y = y]\,p_Y(y) = \sum_y \sum_x x\,p_{X \mid Y}(x \mid y)\,p_Y(y) = \sum_x x \sum_y p_{X,Y}(x, y) = \sum_x x\,p_X(x) = E[X]$$
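The discrete proof can be replayed numerically. A sketch on a small made-up joint PMF (my numbers), computing $g(y) = E[X \mid Y = y]$ and checking $E[g(Y)] = E[X]$:

```python
from fractions import Fraction
F = Fraction

# Hypothetical joint PMF of (X, Y); the four entries sum to 1.
p_XY = {(1, 0): F(1, 6), (2, 0): F(1, 3),
        (1, 1): F(1, 4), (2, 1): F(1, 4)}

# Marginals of Y and X.
p_Y, p_X = {}, {}
for (x, y), p in p_XY.items():
    p_Y[y] = p_Y.get(y, F(0)) + p
    p_X[x] = p_X.get(x, F(0)) + p

def g(y):
    # g(y) = E[X | Y = y] = sum_x x * p_{X|Y}(x | y)
    return sum(x * p / p_Y[y] for (x, yy), p in p_XY.items() if yy == y)

E_g_Y = sum(g(y) * p for y, p in p_Y.items())  # E[E[X|Y]] = E[g(Y)]
E_X   = sum(x * p for x, p in p_X.items())     # E[X]
```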

05 Illustration - extra

Example - Conditional expectations from joint density

Suppose $X$ and $Y$ are random variables with joint density $f_{X,Y}(x, y)$ given by:

Find $E[X \mid Y = y]$. Use this to compute $E[X]$.

Solution

(1) Derive the marginal density $f_Y(y)$:

$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx$$

(2) Use $f_Y(y)$ to compute $f_{X \mid Y}(x \mid y)$:

$$f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}$$

(3) Use $f_{X \mid Y}(x \mid y)$ to calculate the expectation conditioned on the variable event:

$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\,f_{X \mid Y}(x \mid y)\,dx$$

(4) Apply Iterated Expectation:

Set $g(y) = E[X \mid Y = y]$. By Iterated Expectation, we know that $E[X] = E[g(Y)]$. Therefore:

$$E[X] = \int_{-\infty}^{\infty} g(y)\,f_Y(y)\,dy$$

Notice that $g(Y) = E[X \mid Y]$, so $E[g(Y)] = E\big[E[X \mid Y]\big]$, and Iterated Expectation says that $E\big[E[X \mid Y]\big] = E[X]$.

Example - Flip coin, choose RV

Suppose $X_1$ and $X_2$ represent two biased coins, giving 1 for heads and 0 for tails.

Here is the experiment:

  1. Flip a fair coin.
  2. If heads, flip the $X_1$ coin; if tails, flip the $X_2$ coin.
  3. Record the outcome as $X$.

What is $E[X]$?

Solution

Let $Y$ describe the fair coin, with $Y = 1$ for heads and $Y = 0$ for tails. Then:

$$E[X] = E\big[E[X \mid Y]\big] = E[X \mid Y = 1]\,P(Y = 1) + E[X \mid Y = 0]\,P(Y = 0) = \tfrac{1}{2}\,E[X_1] + \tfrac{1}{2}\,E[X_2]$$
Example - Sum of random number of RVs

Let $N$ denote the number of customers that enter a store on a given day.

Let $X_i$ denote the amount spent by the $i$th customer.

Assume that $N$ is independent of the $X_i$, and that $E[X_i] = E[X_1]$ for all $i$.

What is the expected total spend of all customers in a day?

Solution

A formula for the total spend is $T = \sum_{i=1}^{N} X_i$.

By Iterated Expectation, we know $E[T] = E\big[E[T \mid N]\big]$.

Now compute $E[T \mid N = n]$ as a function of $n$, using the independence of $N$ and the $X_i$:

$$E[T \mid N = n] = E\Big[\sum_{i=1}^{n} X_i\Big] = n\,E[X_1]$$

Therefore $g(n) = n\,E[X_1]$ and $E[T \mid N] = N\,E[X_1]$ and $E\big[E[T \mid N]\big] = E[N]\,E[X_1]$.

Then by Iterated Expectation, plugging in the given values of $E[N]$ and $E[X_1]$, the expected total spend is $E[T] = E[N]\,E[X_1] = \$400$.
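A Monte Carlo sketch of the random-sum identity $E[T] = E[N]\,E[X_1]$ with hypothetical distributions (the notes' actual $E[N]$ and $E[X_1]$ are not recoverable here, so this uses $N \sim \text{Uniform}\{0,\dots,8\}$ and $X_i \sim \text{Uniform}(0, 20)$, giving $E[T] = 4 \cdot 10 = 40$):

```python
import random

random.seed(1)

E_N, E_X1 = 4, 10   # means of the hypothetical N and X_i above
trials = 200_000
total = 0.0
for _ in range(trials):
    n = random.randint(0, 8)                               # draw N
    total += sum(random.uniform(0, 20) for _ in range(n))  # T = X_1 + ... + X_N
avg = total / trials

# Iterated Expectation predicts E[T] = E[N] * E[X_1] = 40;
# the sample average should land close to that.
```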
