Mean square error

06 Theory - Minimum mean square error

Theory 1 - Minimum mean square error

Suppose our problem is to estimate, or guess, or predict the value of a random variable $X$ in one run of the experiment. Assume we know the distribution of $X$. Which value do we choose?

There is no single best answer to this question; the best choice depends on additional factors in the problem context.

One method is to pick a value where the PMF or PDF of $X$ is maximal. This is a value of highest probability. (There may be more than one.)

Another method is to pick the expected value $E[X]$.

For the normal distribution, or any symmetric unimodal distribution, these are the same value. For most distributions they are not.
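As a quick numerical illustration (the PMF below is invented for this purpose, not taken from the text), here is a short Python check that the most probable value and the expected value can differ:

```python
# Mode vs. mean for a small, skewed PMF (illustrative values only).
values = [0, 1, 2, 3]
probs = [0.5, 0.2, 0.2, 0.1]  # sums to 1.0

mode = values[probs.index(max(probs))]            # value of highest probability
mean = sum(v * p for v, p in zip(values, probs))  # expected value E[X]

print(mode)  # 0
print(mean)  # 0.9, not the same as the mode
```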


Mean square error

Given an estimate $\hat{x}$ for a random variable $X$, the mean square error (MSE) of $\hat{x}$ is:

$$\operatorname{MSE}(\hat{x}) = E\big[(X - \hat{x})^2\big]$$

The MSE quantifies the typical (square of the) error, meaning the difference between the true value $X$ and the estimate $\hat{x}$. The expected value computes the typical size of this squared difference.

Other error measures, such as the mean absolute error $E\big[\,|X - \hat{x}|\,\big]$, are reasonable and useful in niche contexts. They are not frequently used, so we do not consider their theory further.


In problem contexts where large errors are more costly than small errors (true of many real problems), the most likely value of $X$ (the point where the PMF or PDF is maximal) may fare poorly as an estimate.

It turns out that the expected value $E[X]$ is precisely the value that minimizes the MSE.

Minimal mean square error

Given a random variable $X$, its expectation $E[X]$ provides the estimate with minimal mean square error.

The MSE of the estimate $E[X]$ itself is the variance:

$$E\big[(X - E[X])^2\big] = \operatorname{Var}(X)$$

Proof that $E[X]$ gives minimal MSE

Expand the MSE of a candidate estimate $\hat{x}$:

$$E\big[(X - \hat{x})^2\big] = E[X^2] - 2\hat{x}\,E[X] + \hat{x}^2$$

This is an upward-opening parabola in $\hat{x}$; minimize it. Differentiate with respect to $\hat{x}$:

$$\frac{d}{d\hat{x}}\Big(E[X^2] - 2\hat{x}\,E[X] + \hat{x}^2\Big) = -2E[X] + 2\hat{x}$$

Find the zero:

$$-2E[X] + 2\hat{x} = 0 \quad\Longrightarrow\quad \hat{x} = E[X]$$


When the estimate is made in the absence of information (besides the distribution of $X$ itself), it is called a blind estimate. Therefore, $E[X]$ is the blind minimal MSE estimate, and $\operatorname{Var}(X)$ is the error of this estimate.
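A small numerical sketch of the blind estimate (the distribution and the candidate grid below are assumptions made only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)  # any distribution works here

# MSE of a candidate estimate c, averaged over the sample.
def mse(c):
    return np.mean((x - c) ** 2)

candidates = np.linspace(0.0, 5.0, 501)
best = candidates[np.argmin([mse(c) for c in candidates])]

print(best)                     # close to the sample mean (about 2.0)
print(x.mean())                 # the blind minimal MSE estimate E[X]
print(mse(x.mean()), x.var())   # the minimal error equals Var(X)
```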

In the presence of additional information, namely that an event $A$ is known to have occurred, the minimal MSE estimate is $E[X \mid A]$, and the error of this estimate is $\operatorname{Var}(X \mid A)$.

The MSE estimate can also be conditioned on another variable, say $Y$.

Minimal MSE of $X$ given $Y$

The minimal MSE estimate of $X$ given another variable $Y$ is:

$$g(y) = E[X \mid Y = y]$$

The error of this estimate is $E\big[(X - E[X \mid Y = y])^2 \mid Y = y\big]$, which equals $\operatorname{Var}(X \mid Y = y)$.

Notice that the minimal MSE estimate of $X$ given $Y$ can be used to define a random variable:

$$E[X \mid Y] = g(Y)$$

This variable is a derived variable of $Y$, given by post-composition with the function $g$.

The variable $E[X \mid Y]$ provides the minimal MSE estimates of $X$ when experimental outcomes are viewed as providing the information of $Y$ only, and the model is used to derive estimates of $X$ from this information.
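A sketch of the derived variable $E[X \mid Y]$ for a small discrete joint PMF (the probability table is hypothetical, chosen only to illustrate the computation):

```python
import numpy as np

# Hypothetical joint PMF p(x, y): rows are x-values, columns are y-values.
xs = np.array([0.0, 1.0, 2.0])
ys = np.array([0.0, 1.0])
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])  # entries sum to 1.0

p_y = p.sum(axis=0)                      # marginal PMF of Y
g = (xs[:, None] * p).sum(axis=0) / p_y  # g(y) = E[X | Y = y]

# Compare the error of E[X | Y] with the error of the blind estimate E[X].
p_x = p.sum(axis=1)
EX = (xs * p_x).sum()
mse_blind = ((xs - EX) ** 2 * p_x).sum()                  # Var(X)
mse_cond = (((xs[:, None] - g[None, :]) ** 2) * p).sum()  # E[(X - g(Y))^2]

print(g)                    # minimal MSE estimate of X for each value of Y
print(mse_cond, mse_blind)  # conditioning never increases the error
```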


07 Illustration

Example - Minimal MSE estimate given PMF, given fixed event

Minimal MSE estimate given PMF

Suppose $X$ has the following PMF:

| $x$ | 1 | 2 | 3 | 4 | 5 |
| --- | --- | --- | --- | --- | --- |
| $p_X(x)$ | 0.15 | 0.28 | 0.26 | 0.19 | 0.13 |

Find the minimal MSE estimate of $X$, given that $X$ is even. What is the error of this estimate?

Solution

The minimal MSE estimate given $A = \{X \text{ is even}\} = \{X \in \{2, 4\}\}$ is just $E[X \mid A]$.

First compute the conditional PMF. Since $P(A) = 0.28 + 0.19 = 0.47$:

$$p_{X \mid A}(2) = \frac{0.28}{0.47} \approx 0.596, \qquad p_{X \mid A}(4) = \frac{0.19}{0.47} \approx 0.404$$

Therefore:

$$E[X \mid A] = 2 \cdot \frac{0.28}{0.47} + 4 \cdot \frac{0.19}{0.47} = \frac{1.32}{0.47} \approx 2.81$$

The error is:

$$\operatorname{Var}(X \mid A) = E[X^2 \mid A] - \big(E[X \mid A]\big)^2 = \frac{4(0.28) + 16(0.19)}{0.47} - (2.81)^2 \approx 8.85 - 7.89 \approx 0.96$$

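A quick check of this computation in Python, using the PMF values from the table above:

```python
# PMF of X taken from the table in the example.
pmf = {1: 0.15, 2: 0.28, 3: 0.26, 4: 0.19, 5: 0.13}

# Condition on the event A = {X is even}.
even = {x: p for x, p in pmf.items() if x % 2 == 0}
pA = sum(even.values())                      # P(A) = 0.47
cond = {x: p / pA for x, p in even.items()}  # conditional PMF of X given A

est = sum(x * p for x, p in cond.items())               # E[X | A], about 2.81
err = sum((x - est) ** 2 * p for x, p in cond.items())  # Var(X | A), about 0.96

print(cond, est, err)
```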

Exercise - Minimal MSE estimate from joint PDF

Minimal MSE estimate from joint PDF

Here is the joint PDF of $X$ and $Y$:

Find the minimal MSE estimate of in terms of .

What is the estimate of when ? When ?

Answer


08 Theory - Line of minimal MSE

Theory 2 - Line of minimal MSE

Linear approximation is very common in applied math.

One could consider the linearization of the function $g(y) = E[X \mid Y = y]$ (its tangent line) instead of the exact function.

Instead, one can minimize the MSE over all possible linear functions of $Y$. The line with minimal MSE is called the linear estimator.

Line of minimal MSE

Let $\ell$ be the line $\ell(y) = ay + b$. Let $\hat{X} = \ell(Y) = aY + b$.

The mean square error (MSE) of $\hat{X}$ is:

$$E\big[(X - \hat{X})^2\big] = E\big[(X - aY - b)^2\big]$$

The linear estimator is the line with minimal MSE, and it is:

$$\hat{X}_L = E[X] + \frac{\operatorname{Cov}(X, Y)}{\operatorname{Var}(Y)}\,\big(Y - E[Y]\big) = E[X] + \rho\,\frac{\sigma_X}{\sigma_Y}\,\big(Y - E[Y]\big)$$

The minimal error value is:

$$E\big[(X - \hat{X}_L)^2\big] = (1 - \rho^2)\operatorname{Var}(X)$$

The variable of minimal error, $X - \hat{X}_L$, is uncorrelated with $Y$.

Slope and $\rho$

Notice:

$$\frac{\hat{X}_L - E[X]}{\sigma_X} = \rho\,\frac{Y - E[Y]}{\sigma_Y}$$

Thus, $\rho$ is the slope of the minimal MSE line for the standardized variables $\dfrac{X - E[X]}{\sigma_X}$ and $\dfrac{Y - E[Y]}{\sigma_Y}$.

In each of the accompanying graphs (not reproduced here), the line of minimal MSE is the “best fit” line through the scatter of the data.
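A sketch of estimating the line of minimal MSE from simulated data (the joint model used to generate the data is an assumption made only for this illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical joint model used only to generate data: Y ~ N(0, 1), X = 2Y + noise.
y = rng.normal(size=200_000)
x = 2.0 * y + rng.normal(scale=0.5, size=y.size)

# Line of minimal MSE for estimating X from Y.
a = np.cov(x, y, ddof=0)[0, 1] / y.var()  # slope = Cov(X, Y) / Var(Y)
b = x.mean() - a * y.mean()               # intercept: the line passes through the means
x_hat = a * y + b

rho = np.corrcoef(x, y)[0, 1]
print(a, b)                               # close to 2 and 0
print(np.mean((x - x_hat) ** 2))          # minimal error ...
print((1 - rho ** 2) * x.var())           # ... equals (1 - rho^2) Var(X)
print(np.cov(x - x_hat, y, ddof=0)[0, 1]) # residual is uncorrelated with Y (about 0)
```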


09 Illustration

Example - Estimating on a variable interval

Estimating on a variable interval

Suppose that and suppose .

(a) Find (b) Find (c) Find

Solution

(a) Find .

We know .

Given , so is uniform on , we have .


(b) Find .

(1) We know .

To compute this function, we calculate a sequence of densities.


(2) We know and . From these we derive the joint distribution :

Now extract the marginal :

Now deduce the conditional :


(3) Then:

So .


(c) Find .

(1) We need all the basic statistics.

because .

.

using the marginal PDF on . (Integration by parts and L'Hôpital's rule are needed.)

also using the marginal .


(2) using , namely:

From this we infer and .


(3) Hence:

Thus:

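The specific densities in this example did not survive extraction, so as a stand-in here is a Monte Carlo sketch of the same workflow for one common “variable interval” model, namely $X$ uniform on $(0, 1)$ and, given $X = x$, $Y$ uniform on $(0, x)$. This model is an assumption for illustration, not necessarily the one used above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed model (illustration only): X ~ Uniform(0, 1); given X = x, Y ~ Uniform(0, x).
n = 500_000
x = rng.uniform(0.0, 1.0, size=n)
y = x * rng.uniform(0.0, 1.0, size=n)  # Y uniform on the variable interval (0, X)

# Spot-check a conditional expectation: under this model E[Y | X = x] = x / 2.
near = np.abs(x - 0.6) < 0.01
print(y[near].mean())                  # close to 0.3

# Line of minimal MSE for estimating Y from X: slope Cov(X, Y) / Var(X).
a = np.cov(x, y, ddof=0)[0, 1] / x.var()
b = y.mean() - a * x.mean()
print(a, b)                            # close to 0.5 and 0 under this model

rho = np.corrcoef(x, y)[0, 1]
print((1 - rho ** 2) * y.var())        # predicted minimal error, about 1/36
print(np.mean((y - (a * x + b)) ** 2)) # observed error of the fitted line
```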

Exercise - Line of minimal MSE given joint PDF

Line of minimal MSE given joint PDF

Here is the joint PDF of $X$ and $Y$:

Find the line giving the linear MSE estimate of in terms of .

What is the expected error of this line, ?

What is the estimate of when ? When ?

Answer
