Suppose the PDF of is given by:
Find the CDF and PDF of .
**Solution**
Start by finding the CDF of , . First, let us compute the CDF of , :
Thus, we have that
Finally, differentiate to find the density function of , :
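In general, for a transformation $Y = g(X)$ of a continuous random variable $X$ (the generic $g$ here stands in for the particular transformation given in the problem), the CDF method used above amounts to:

$$F_Y(y) = P[Y \leq y] = P[g(X) \leq y], \qquad f_Y(y) = \frac{d}{dy} F_Y(y),$$

and, when $g$ is strictly increasing and differentiable,

$$F_Y(y) = F_X\!\bigl(g^{-1}(y)\bigr), \qquad f_Y(y) = f_X\!\bigl(g^{-1}(y)\bigr)\,\frac{d}{dy} g^{-1}(y).$$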
#### 02: PDF of min and max
Suppose and and these variables are independent. Find:
(a) The PDF of
(b) The PDF of
**Solution**
(a)
First, we define the PDFs and CDFs of and :
Now, since, by definition, . By independence, . Thus, we have: and thus,
(b)
Similarly, for , we have that . By independence, . Thus, we have that:
Thus, the density function is given by:
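For reference, both parts apply the standard identities for the maximum and minimum of independent random variables, writing $F_X, F_Y$ for the CDFs and $f_X, f_Y$ for the PDFs of $X$ and $Y$:

$$\begin{align*}
F_{\max(X,Y)}(w) &= P[X \leq w]\,P[Y \leq w] = F_X(w)\,F_Y(w), \\
f_{\max(X,Y)}(w) &= f_X(w)\,F_Y(w) + F_X(w)\,f_Y(w), \\
F_{\min(X,Y)}(w) &= 1 - P[X > w]\,P[Y > w] = 1 - \bigl(1 - F_X(w)\bigr)\bigl(1 - F_Y(w)\bigr), \\
f_{\min(X,Y)}(w) &= f_X(w)\bigl(1 - F_Y(w)\bigr) + \bigl(1 - F_X(w)\bigr)\,f_Y(w).
\end{align*}$$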
### Sums of random variables
#### 03: PDF of sum from joint PDF
Suppose the joint PDF of and is given by:
Find the PDF of .
**Solution**
Let . We want to find , which we shall do using the convolution formula. Loosely, we have that for acceptable values of and .
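The convolution-type formula referred to here, for $W = X + Y$ with joint density $f_{X,Y}$, is

$$f_W(w) = \int_{-\infty}^{\infty} f_{X,Y}(x,\, w - x)\, dx,$$

where in practice the integral runs only over the $x$ (and $w$) for which the integrand is positive; determining those acceptable values is what the bounds below accomplish.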
First, consider the range of : Since and , we have that . Thus, we need only concern ourselves with the case when .
Now that we have a range for , we must find acceptable values of . Since both and , we have that . However, , by the condition for the joint PDF given above. Thus, .
Similarly, and . Solving the second equation, we have that . Thus, . Since , , . Thus, we can restrict our condition to .
Now that we have bounds, we can finally apply the convolution formula:
We now take cases to deal with the upper bound: when , , and so our upper bound is . If , and , so our upper bound is . Plugging these values in and evaluating, we have our density function:
#### 04: Poisson plus Bernoulli
Suppose that:
and are independent.
Find a formula for the PMF of .
Apply your formula with and to find .
**Solution**
We have that
Let . Then . Now, since only if , we have that . Thus,
Plugging , , and into our PMF above, we have that .
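For reference, with $X \sim \text{Poisson}(\lambda)$ and $Y \sim \text{Bernoulli}(p)$ independent ($\lambda$ and $p$ written generically for the parameters given in the problem), conditioning on the two possible values of $Y$ gives the PMF of the sum:

$$P[X + Y = k] = (1-p)\,P[X = k] + p\,P[X = k - 1] = (1-p)\,\frac{e^{-\lambda}\lambda^{k}}{k!} + p\,\frac{e^{-\lambda}\lambda^{k-1}}{(k-1)!}, \qquad k \geq 1,$$

with $P[X + Y = 0] = (1-p)\,e^{-\lambda}$.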
#### 05: Convolution for uniform distributions over intervals
Suppose that:
and are independent.
Find the PDF of .
**Solution**
Define .
We know that
Note that the range of is .
Divide into cases.
:
:
:
.
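Each of these cases evaluates the same convolution integral over a different overlap region. Writing $X \sim \text{Unif}(a, b)$ and $Y \sim \text{Unif}(c, d)$ with $b - a \leq d - c$ (the labeling implied by the final answer below):

$$f_W(w) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx = \frac{\operatorname{length}\bigl([a, b] \cap [w - d,\, w - c]\bigr)}{(b-a)(d-c)},$$

so each case amounts to measuring how much of $[a, b]$ overlaps $[w - d,\, w - c]$.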
Formalize final answer.
$$f_{W}(w) = \begin{cases} 0 & w < a + c, \ w > b + d \\ \frac{w - a - c}{(b-a)(d-c)} & a + c \leq w \leq b + c \\ \frac{1}{d-c} & b + c \leq w \leq a + d \\ \frac{b + d - w}{(b - a)(d - c)} & a + d \leq w \leq b + d \end{cases}$$

#### 06: Sums of normals

(a) Suppose $X,\,Y\sim\mathcal{N}(\mu,\sigma^2)$ are independent variables. Find the values of $\mu$ and $\sigma$ for which $X+X\sim X+Y$, or prove that none exist.

(b) Suppose $\mu=0$, $\sigma=1$ in part (a). Find $P[X>Y+2]$.

(c) Suppose $X\sim\mathcal{N}(0,\sigma_X)$ and $Y\sim\mathcal{N}(0,\sigma_Y)$. Find $P[X-3Y>0]$.

**Solution**

(a)

1. Suppose $X,Y\sim\mathcal{N}(\mu,\sigma^2)$ and $X,Y$ are independent (since they have the same distribution, they are called independent identically distributed, or IID, random variables). Now suppose that $X+X\sim X+Y$. Notice that if $X+X\sim X+Y$, then $\mathbb{E}(X+X)=\mathbb{E}(X+Y)$ and $\text{Var}(X+X)=\text{Var}(X+Y)$.
2. For the first condition, we have $\mathbb{E}(X+X)=\mathbb{E}(X)+\mathbb{E}(X)=2\mu=\mathbb{E}(X)+\mathbb{E}(Y)=\mathbb{E}(X+Y)$, and thus the first condition holds for any $\mu\in\mathbb{R}$.
3. For the second condition, by independence, we must have $\text{Var}(X+X)=\text{Var}(2X)=\text{Var}(X+Y)=\text{Var}(X)+\text{Var}(Y)$. Then we must have $4\text{Var}(X)=4\sigma^2=\sigma^2+\sigma^2=2\sigma^2$, which forces $\sigma^2=0$, so $\sigma=0$ is the only solution.
4. Thus, $\mu\in\mathbb{R}$ and $\sigma=0$, which signals that $X$ and $Y$ are constants, are the only values that satisfy the given condition.

(b)

1. Define $W = X - Y - 2$.
	- The mean is $\mu_{W} = \mu - \mu - 2 = -2$.
	- The variance is $\sigma^{2}_{W} = \sigma^{2} + \sigma^{2} = 2$.
	- Thus, $W \sim \mathcal{N}(-2, 2)$.
2. Note that $P[X > Y + 2] = P[W > 0]$. Standardize $W$ and use the lookup table.

$$\begin{align*}P[W > 0] &= P\left[Z > \frac{2}{\sqrt{2}}\right] \\ &= 1 - P\left[Z < \frac{2}{\sqrt{2}}\right] \\ &\approx 1 - \Phi(1.4142) \\ &\approx 1 - 0.9213 = 0.0787 \end{align*}$$

(c)

1. Let $W = X - 3Y$.
	- $\mu_{W} = 0$.
	- $\sigma^{2}_{W} = \sigma^{2}_{X} + 3^{2}\sigma_{Y}^{2}$.
2. Note that $P[X - 3Y > 0] = P[W > 0]$. Standardize $W$ and use the lookup table.

$$\begin{align*} P[W > 0] &= P[Z > 0] = 0.5 \end{align*}$$
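As a quick numeric sanity check on parts (b) and (c), here is a minimal sketch using SciPy's normal survival function; the values of $\sigma_X$ and $\sigma_Y$ in part (c) are hypothetical, since the answer does not depend on them.

```python
from math import sqrt
from scipy.stats import norm

# Part (b): X, Y ~ N(0, 1) independent, so W = X - Y - 2 ~ N(-2, 2).
# P[X > Y + 2] = P[W > 0], the survival function of W at 0.
p_b = norm.sf(0, loc=-2, scale=sqrt(2))
print(round(p_b, 4))  # 0.0786, consistent with 1 - 0.9213 = 0.0787 from the table

# Part (c): W = X - 3Y is a zero-mean normal whatever sigma_X and sigma_Y are,
# so P[W > 0] = 0.5 regardless of the values chosen below.
sigma_x, sigma_y = 1.0, 2.0  # hypothetical values; the probability does not depend on them
p_c = norm.sf(0, loc=0, scale=sqrt(sigma_x**2 + 9 * sigma_y**2))
print(p_c)  # 0.5
```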