Practice problems

Penrose quasi-inverse: smallest LLS solution

Given an LLS problem with matrix $A$ and vector $\vec{b}$ (that is, minimize $\|A\vec{x} - \vec{b}\|$), show that the Penrose quasi-inverse solution $\vec{x}^* = A^+\vec{b}$ is the LLS solution of smallest length. Show that this solution lies in the cokernel of $A$. (Hint: use the SVD of $A$.)
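
As a numerical sanity check (not a substitute for the proof), the following sketch compares the quasi-inverse solution against other LLS solutions. The matrix and vector are placeholders chosen so that $A$ is rank-deficient; NumPy's `np.linalg.pinv` computes $A^+$ via the SVD.

```python
import numpy as np

# Placeholder rank-deficient example (not from the packet):
# row 2 is twice row 1, so A has rank 2 and a 1-dimensional kernel.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

x_star = np.linalg.pinv(A) @ b           # Penrose quasi-inverse solution

# Every other LLS solution differs from x_star by a kernel vector of A.
# Extract a kernel basis from the SVD: rows of Vt whose singular value
# is (numerically) zero.
U, s, Vt = np.linalg.svd(A)
kernel = Vt[np.sum(s > 1e-12):]

for t in [0.5, -1.0, 2.0]:
    x_other = x_star + t * kernel[0]
    # Both have the same residual ...
    assert np.isclose(np.linalg.norm(A @ x_other - b),
                      np.linalg.norm(A @ x_star - b))
    # ... but x_star is never longer.
    assert np.linalg.norm(x_star) <= np.linalg.norm(x_other)

# x_star is orthogonal to ker(A), consistent with it lying in the span
# of the right singular vectors with nonzero singular values.
print(kernel @ x_star)                   # approximately zero
```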

Problem 13-04

Optimizing to find singular vectors

Show that the second largest singular value of $A$ is equal to the maximum of $\|A\vec{x}\|$, where $\vec{x}$ varies over all unit vectors orthogonal to $\vec{v}_1$, a right singular vector corresponding to the largest singular value of $A$.

(Notation clarification: if the singular values are listed in order of size $\sigma_1 \geq \sigma_2 \geq \cdots$, then the largest singular value is $\sigma_1$, the second largest is $\sigma_2$, and the right singular vector in the problem is $\vec{v}_1$, satisfying $\|A\vec{v}_1\| = \sigma_1$.)
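
A quick numerical illustration of this characterization, under an assumed random matrix: sample unit vectors orthogonal to $\vec{v}_1$ and observe that $\|A\vec{x}\|$ never exceeds $\sigma_2$, with the maximum attained at $\vec{v}_2$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))           # assumed example matrix

U, s, Vt = np.linalg.svd(A)               # s is sorted: s[0] >= s[1] >= ...
v1, v2 = Vt[0], Vt[1]

# Sample unit vectors orthogonal to v1 and measure ||A x||.
best = 0.0
for _ in range(10000):
    x = rng.standard_normal(3)
    x -= (x @ v1) * v1                    # remove the v1 component
    x /= np.linalg.norm(x)
    best = max(best, np.linalg.norm(A @ x))

print(best, "<=", s[1])                   # approaches sigma_2 from below
print(np.linalg.norm(A @ v2), "==", s[1]) # the maximum is attained at v2
```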

Problem 14-05

Penrose quasi-inverse

For this problem, use the definition of $A^+$ based on the SVD of $A$.

  • (a) Verify that $A^+A \neq I$ and that $AA^+ \neq I$ for the example studied in the section on the quasi-inverse. Think about how and why this happens in terms of the SVD.
  • (b) Show that for any matrix $A$ and vector $\vec{b}$, $AA^+\vec{b}$ is the projection of $\vec{b}$ onto the image of $A$.
  • (c) Show that for any matrix $A$, $AA^+A = A$ and $A^+AA^+ = A^+$.

It may be helpful (though not required) for this problem to use the presentation of the SVD in the form $A = \sum_i \sigma_i \vec{u}_i \vec{v}_i^T$ as in Packet 13. In order to use this, figure out what the corresponding presentation of the SVD of $A^+$ should be.
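
The claims in (a)–(c) can also be checked numerically. The sketch below uses an assumed rank-deficient matrix in place of the packet's example, so that both products in (a) fail to be the identity.

```python
import numpy as np

rng = np.random.default_rng(1)
# Assumed 5x3 example of rank 2, standing in for the packet's matrix.
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))
A_plus = np.linalg.pinv(A)                # SVD-based quasi-inverse

# (a) Neither product is the identity.
print(np.allclose(A_plus @ A, np.eye(3)))        # False
print(np.allclose(A @ A_plus, np.eye(5)))        # False

# (b) A A^+ b agrees with the projection of b onto im(A), computed
# independently here via a least-squares solve.
b = rng.standard_normal(5)
x_lls = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(A @ A_plus @ b, A @ x_lls))    # True

# (c) Both identities hold for any matrix.
print(np.allclose(A @ A_plus @ A, A))            # True
print(np.allclose(A_plus @ A @ A_plus, A_plus))  # True
```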

Problem 14-06

LLS uniqueness

  • (a) Show that: $A^TA\vec{x} = \vec{0}$ if and only if $A\vec{x} = \vec{0}$. (Hint: use the fact that $\|A\vec{x}\|^2 = \vec{x}^T A^TA \vec{x}$.)
  • (b) Show that: $A^TA$ is invertible if and only if $A$ has independent columns. (Warning: $A$ need not be square. Hint: consider the dimensions of kernels, and use (a) as well as the rank-nullity theorem.)
  • (c) Explain why the LLS problem for $A$ and $\vec{b}$ has a unique solution if and only if the columns of $A$ are independent. (Use (b).)
  • (d) Continuing from (b), show that: $\operatorname{rank}(A^TA) = \operatorname{rank}(A)$. (Hint: apply rank-nullity to both matrices. Notice that $A^TA$ and $A$ have the same number of columns.)

You may observe that (a) implies $\ker(A^TA) = \ker(A)$, while (d) means $\operatorname{rank}(A^TA) = \operatorname{rank}(A)$. In other words, $A^TA$ and $A$ always have the same size kernels and the same size images. This makes sense in terms of the SVD if you think about kernels, cokernels, and images generated by the orthonormal basis vectors $\vec{u}_i$'s and $\vec{v}_i$'s, writing $A = \sum_i \sigma_i \vec{u}_i \vec{v}_i^T$ as in the previous problem (keeping careful track of which $\sigma_i$'s are zero or nonzero!).
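
A small numerical check of these facts, with an assumed wide, rank-deficient matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
# Assumed 4x6 example of rank 2, so both comparisons are nontrivial.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 6))
G = A.T @ A                                # the 6x6 matrix A^T A

# Same size images: equal ranks.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(G))   # 2 2

# Same kernels: a kernel basis of A (from the SVD) is killed by A^T A.
U, s, Vt = np.linalg.svd(A)
ker_A = Vt[np.sum(s > 1e-12):]             # 4 basis vectors of ker(A) in R^6
print(np.allclose(G @ ker_A.T, 0))         # True
```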

Finding a projection to the space perpendicular to given vectors

Let the following vectors be given:

  • Find the orthogonal complement of the span of the given vectors, written as a span.
  • Find the projection of the given vector onto that complement using the normal equations.
  • Find a matrix that performs the projection onto this orthogonal complement, using an augmented matrix. (A sketch of the computation appears after this list.)
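
Here is a sketch with placeholder vectors (the packet's specific vectors are not reproduced here). It uses the normal-equations projection $P = A(A^TA)^{-1}A^T$ onto the span, and $I - P$ for the complement, rather than an augmented matrix.

```python
import numpy as np

# Placeholder vectors; substitute the vectors given in the problem.
a1 = np.array([1.0, 0.0, 2.0, 1.0])
a2 = np.array([0.0, 1.0, 1.0, -1.0])
A = np.column_stack([a1, a2])              # columns span the given subspace

# Projection onto span(a1, a2) via the normal equations,
# assuming a1 and a2 are independent.
P_span = A @ np.linalg.solve(A.T @ A, A.T)

# Projecting onto the orthogonal complement is what is left over.
P_perp = np.eye(4) - P_span

b = np.array([1.0, 2.0, 3.0, 4.0])         # placeholder vector to project
print(P_perp @ b)                          # component of b orthogonal to a1, a2
print(np.allclose(A.T @ (P_perp @ b), 0))  # True: result is perpendicular
```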
Problem 14-03

Fitting a plane to four points

Consider the four data points in 3D space:

Find the parameters $a$, $b$, and $c$ for which the plane defined by $z = ax + by + c$ best fits these data points.

You may notice that the data points correspond to heights at the four corners of a square. Normally a plane is defined by 3 non-collinear points through which it passes. No plane passes through these four given points, but there is a plane that minimizes the sum of squares of the vertical errors.
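
For illustration, the sketch below fits a plane to placeholder heights at the four corners of the unit square; the packet's actual data points are not reproduced here.

```python
import numpy as np

# Placeholder data: heights z at the four corners of the unit square
# (not the packet's points).
pts = np.array([[0.0, 0.0, 1.0],
                [1.0, 0.0, 2.0],
                [0.0, 1.0, 3.0],
                [1.0, 1.0, 5.0]])
x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]

# Model z = a x + b y + c: each data point contributes one row
# [x_i, y_i, 1] of the design matrix.
X = np.column_stack([x, y, np.ones_like(x)])
params, *_ = np.linalg.lstsq(X, z, rcond=None)
a, b, c = params
print(f"z = {a:.3f} x + {b:.3f} y + {c:.3f}")

residual = z - X @ params
print("sum of squared vertical errors:", residual @ residual)
```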

LLS uniqueness

Suppose we are given data points $(x_1, y_1), \ldots, (x_n, y_n)$. The linear least squares problem has a unique solution if and only if the design matrix's column vectors are independent. (You may use this fact from the previous problem.)

  • (a) Suppose we are finding the line of best fit for the given data. (Model: $y = mx + b$.) Show that as long as we can find indices $i$ and $j$ with $x_i \neq x_j$, there is a unique best fit line for this data.
  • (b) Suppose we are finding the parabola of best fit for the given data. (Model: $y = ax^2 + bx + c$.) Suppose that $x_i$, $x_j$, and $x_k$ are all distinct values for some indices $i$, $j$, and $k$. Show that there is a unique best fit parabola for this data. (A numerical illustration of both rank conditions follows this list.)
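
A minimal sketch of both rank conditions, assuming sample $x$-values that are not the packet's data:

```python
import numpy as np

x = np.array([0.0, 1.0, 1.0, 2.0])        # assumed sample: 3 distinct values

# Line model y = mx + b: design matrix columns are x and 1.
X_line = np.column_stack([x, np.ones_like(x)])
print(np.linalg.matrix_rank(X_line))      # 2: full rank since some x_i != x_j

# Parabola model y = ax^2 + bx + c: Vandermonde-style design matrix.
X_par = np.column_stack([x**2, x, np.ones_like(x)])
print(np.linalg.matrix_rank(X_par))       # 3: full rank with 3 distinct values

x_bad = np.array([1.0, 1.0, 2.0, 2.0])    # only 2 distinct values
X_bad = np.column_stack([x_bad**2, x_bad, np.ones_like(x_bad)])
print(np.linalg.matrix_rank(X_bad))       # 2: dependent columns, no unique fit
```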

LLS summation curve model

Consider the data points:

Find the parameters of best fit for the given model. Clearly identify your design matrix, observation vector, and parameter vector. Compute the error vector and the total error.
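
Since the packet's model and data points are not reproduced here, the sketch below assumes a generic two-term model $y = c_1 x + c_2 \sin x$ with placeholder data, to illustrate the required pieces: design matrix, observation vector, parameter vector, error vector, and total error.

```python
import numpy as np

# Placeholder data and assumed model y = c1 * x + c2 * sin(x)
# (neither is from the packet).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 1.7, 2.9, 2.4, 1.2])

# Design matrix: one column per model term, one row per data point.
X = np.column_stack([x, np.sin(x)])
# Observation vector: y.  Parameter vector: c = (c1, c2).
c, *_ = np.linalg.lstsq(X, y, rcond=None)
print("parameters:", c)

error = y - X @ c                          # error (residual) vector
print("error vector:", error)
print("total error:", error @ error)       # sum of squared errors
```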