Theory 1
By composing any function $g : \mathbb{R} \to \mathbb{R}$ with a random variable $X$, we obtain a new random variable $g(X)$, called a derived random variable.
Notation
The derived random variable $g \circ X$ may be written "$g(X)$".
Expectation of derived variables
Discrete case:
$$E[g(X)] = \sum_x g(x)\, p_X(x)$$
(Here the sum is over all possible values $x$ of $X$, i.e. $x$ where $p_X(x) > 0$.) Continuous case:
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx$$
Notice: when applied to an outcome $\omega$, the number $X(\omega)$ is the output of $X$, and $g(X(\omega))$ is the output of $g(X)$.
The proofs of these formulas are tricky because we must relate the PDF or PMF of $g(X)$ to that of $X$.
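For a concrete illustration (the fair die and the function $g(x) = x^2$ are choices of convenience, not from the notes above): let $X$ be a fair six-sided die. The discrete formula gives
$$E[X^2] = \sum_{x=1}^{6} x^2 \cdot \tfrac{1}{6} = \frac{1 + 4 + 9 + 16 + 25 + 36}{6} = \frac{91}{6} \approx 15.17.$$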
Proof - Discrete case - Expectation of derived variable
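A sketch of the standard argument: group the terms of $\sum_x g(x)\, p_X(x)$ according to the value $y = g(x)$. Since the event $\{g(X) = y\}$ is the disjoint union of the events $\{X = x\}$ over all $x$ with $g(x) = y$, we have $p_{g(X)}(y) = \sum_{x:\, g(x) = y} p_X(x)$. Therefore
$$\sum_x g(x)\, p_X(x) = \sum_y \sum_{x:\, g(x) = y} y\, p_X(x) = \sum_y y\, p_{g(X)}(y) = E[g(X)].$$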
Linearity of expectation
For constants $a$ and $b$:
$$E[aX + b] = a\, E[X] + b$$
For any $X$ and $Y$ on the same probability model:
$$E[X + Y] = E[X] + E[Y]$$
Exercise - Linearity of expectation
Using the definition of expectation, verify both linearity formulas for the discrete case.
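A numerical companion to this exercise (a minimal sketch; the fair die, the fair coin, and the constants $a = 2$, $b = 5$ are arbitrary choices, not part of the exercise):

```python
from fractions import Fraction

# PMF of a fair six-sided die (arbitrary example distribution).
pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}
a, b = 2, 5

# E[aX + b] from the definition: sum over the values of the derived variable.
pmf_aXb = {a * x + b: p for x, p in pmf_X.items()}
lhs = sum(v * p for v, p in pmf_aXb.items())

# a E[X] + b from the PMF of X.
E_X = sum(x * p for x, p in pmf_X.items())
print(lhs, a * E_X + b)  # both are 12

# E[X + Y] via a joint PMF (X, Y independent here for simplicity,
# though linearity holds for any joint distribution).
pmf_Y = {0: Fraction(1, 2), 1: Fraction(1, 2)}  # a fair coin
E_XY = sum((x + y) * px * py for x, px in pmf_X.items() for y, py in pmf_Y.items())
E_Y = sum(y * p for y, p in pmf_Y.items())
print(E_XY, E_X + E_Y)  # both are 4
```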
Be careful!
Usually $E[g(X)] \neq g(E[X])$. For example, usually $E[X^2] \neq (E[X])^2$. We distribute $E$ over sums but not products (unless the factors are independent).
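For a tiny counterexample: if $X$ is a fair coin taking values $0$ and $1$, then $E[X^2] = \tfrac{1}{2}$ while $(E[X])^2 = \tfrac{1}{4}$.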
Variance squares the scale factor
For constants $a$ and $b$:
$$\operatorname{Var}(aX + b) = a^2\, \operatorname{Var}(X)$$
Thus variance ignores the offset and squares the scale factor. It is not linear!
Proof - Variance squares the scale factor
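A sketch using linearity of expectation: write $\mu = E[X]$, so $E[aX + b] = a\mu + b$. Then
$$\operatorname{Var}(aX + b) = E\big[(aX + b - (a\mu + b))^2\big] = E\big[a^2 (X - \mu)^2\big] = a^2\, E\big[(X - \mu)^2\big] = a^2\, \operatorname{Var}(X).$$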
Extra - Moments
The $n$th moment of $X$ is defined as the expectation of $X^n$. Discrete case:
$$E[X^n] = \sum_x x^n\, p_X(x)$$
Continuous case:
$$E[X^n] = \int_{-\infty}^{\infty} x^n\, f_X(x)\, dx$$
A central moment of $X$ is a moment of the centered variable $X - E[X]$:
$$E\big[(X - E[X])^n\big]$$
For example, the second central moment is exactly the variance $\operatorname{Var}(X)$.
Under mild conditions (for instance, when the moment generating function is finite near $0$), the data of all the moments collectively determines the probability distribution. This fact can be very useful! In this way moments give an analogue of a series representation, and they are sometimes more convenient than the PDF or CDF for encoding the distribution.
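As a concrete sketch of the definitions (the exponential distribution and its rate $\lambda = 2$ are arbitrary choices; for $X \sim \mathrm{Exp}(\lambda)$ the closed form $E[X^n] = n!/\lambda^n$ is standard):

```python
import math
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0
# One million samples of X ~ Exp(lam); NumPy parametrizes by scale = 1/rate.
X = rng.exponential(scale=1 / lam, size=1_000_000)

for n in range(1, 5):
    mc = np.mean(X ** n)                  # Monte Carlo estimate of the nth moment
    exact = math.factorial(n) / lam ** n  # known closed form for Exp(lam)
    print(n, round(float(mc), 4), exact)
```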
Theory 2
Suppose we are given the PDF $f_X$ of a random variable $X$. What is the PDF $f_{g(X)}$ of the derived variable $g(X)$?
PDF of derived variables
The PDF of $g(X)$ is not (usually) equal to $g(f_X(x))$: we cannot simply apply $g$ to the PDF of $X$.
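For instance, if $X$ is uniform on $(0,1)$ and $g(x) = 2x$, then $2X$ is uniform on $(0,2)$ with PDF $\tfrac{1}{2}$ there, whereas $g(f_X(x)) = 2$ on $(0,1)$, which is not even a valid density.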
Relating PDF and CDF
When the CDF $F_X$ of $X$ is differentiable, we have:
$$f_X(x) = \frac{d}{dx} F_X(x)$$
Therefore, if we know the CDF of $g(X)$, we can differentiate it to obtain the PDF. This suggests a three-step method:
(1) Find $F_X$:
Compute $F_X(x) = \int_{-\infty}^{x} f_X(t)\, dt$.
Now remember that $F_X(x) = P(X \le x)$.
(2) Find $F_{g(X)}$:
When possible, rewrite $F_{g(X)}(y) = P(g(X) \le y)$ as a probability statement about $X$ itself, and express it in terms of $F_X$.
(3) Find $f_{g(X)}$:
Differentiate: $f_{g(X)}(y) = \frac{d}{dy} F_{g(X)}(y)$.
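To illustrate the three steps (with a standard example chosen for convenience): let $X$ be uniform on $(0,1)$ and $g(x) = x^2$. For $0 < y < 1$:
$$F_{X^2}(y) = P(X^2 \le y) = P(X \le \sqrt{y}) = F_X(\sqrt{y}) = \sqrt{y}$$
Differentiating gives
$$f_{X^2}(y) = \frac{d}{dy} \sqrt{y} = \frac{1}{2\sqrt{y}}, \qquad 0 < y < 1.$$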
Method of differentials
Change variables: The measure for integration is $f_X(x)\, dx$. Set $y = g(x)$ (with $g$ differentiable and invertible) so $x = g^{-1}(y)$ and $dx = (g^{-1})'(y)\, dy$. Thus $f_X(x)\, dx = f_X(g^{-1}(y))\, (g^{-1})'(y)\, dy$. So the measure of integration in terms of $y$ is $f_X(g^{-1}(y))\, \left|(g^{-1})'(y)\right|\, dy$, whose density is the PDF of $g(X)$:
$$f_{g(X)}(y) = f_X(g^{-1}(y))\, \left| \frac{d}{dy} g^{-1}(y) \right|$$
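Applied to the same example as above ($X$ uniform on $(0,1)$, $y = x^2$, so $x = \sqrt{y}$ and $dx = \tfrac{1}{2\sqrt{y}}\, dy$): the measure $f_X(x)\, dx = 1 \cdot dx$ becomes $\tfrac{1}{2\sqrt{y}}\, dy$, recovering $f_{X^2}(y) = \tfrac{1}{2\sqrt{y}}$ on $(0,1)$, in agreement with the CDF method.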