Biostat 216 Final

Dec 9, 2021, 3pm-6pm, CHS 41-268

In class, closed-book, one page (letter size, double-sided) cheat sheet allowed.

Make sure to write your name and UID on your answer sheets. Also number the answer sheets.

  • Q1. (9pts). Let $$ \mathbf{S} = \begin{pmatrix} \cos \theta & - \sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} \begin{pmatrix} 2 & 0 \\ 0 & 5 \end{pmatrix} \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}. $$ Find (a) the determinant of $\mathbf{S}$, (b) the trace of $\mathbf{S}$ (sum of diagonal entries), (c) the eigenvalues of $\mathbf{S}$, (d) the eigenvectors of $\mathbf{S}$, (e) the eigenvalues of $\mathbf{S}^5$, (f) the eigenvalues of $\mathbf{S} - 0.5 \mathbf{I}_2$, (g) a reason why $\mathbf{S}$ is positive definite, (h) the singular value decomposition (SVD) of $\mathbf{S}$, (i) a best rank-1 approximation (in terms of Frobenius or spectral norm) to $\mathbf{S}$.
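The answers to (a)–(e) can be sanity-checked numerically; a minimal NumPy sketch, assuming the arbitrary choice $\theta = \pi/6$ (the answers are the same for any $\theta$):

```python
import numpy as np

theta = np.pi / 6                     # arbitrary illustrative angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])       # rotation matrix
D = np.diag([2.0, 5.0])
S = R @ D @ R.T                       # S = R diag(2, 5) R'

print(np.linalg.det(S))               # (a) 2 * 5 = 10
print(np.trace(S))                    # (b) 2 + 5 = 7
print(np.linalg.eigvalsh(S))          # (c) eigenvalues 2 and 5
print(np.linalg.eigvalsh(np.linalg.matrix_power(S, 5)))  # (e) 2^5 and 5^5
print(np.linalg.svd(S)[1])            # (h) singular values 5 and 2
```

Since $\mathbf{S}$ is symmetric with positive eigenvalues, its SVD coincides with its spectral decomposition, which is why the singular values equal the eigenvalues here.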
  • Q2. (6pts)
    1. Show that the eigenvectors $\mathbf{x}_1$ and $\mathbf{x}_2$, corresponding to distinct eigenvalues $\lambda_1$ and $\lambda_2$, of a square matrix are linearly independent. Hint: proof by contradiction.
    2. Show that the eigenvectors $\mathbf{x}_1$ and $\mathbf{x}_2$, corresponding to distinct eigenvalues $\lambda_1$ and $\lambda_2$, of a symmetric matrix are orthogonal to each other. Hint: show $\mathbf{x}_2 \in \mathcal{N}(\mathbf{A}-\lambda_2 \mathbf{I})$ and $\mathbf{x}_1 \in \mathcal{C}(\mathbf{A}-\lambda_2 \mathbf{I})$.
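Both facts can be probed numerically (an illustrative sketch, not a proof; the random symmetric matrix below is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((4, 4))
S = B + B.T                           # symmetric; eigenvalues generically distinct

lam, V = np.linalg.eig(S)             # general eig, no symmetry assumed
# 1. eigenvectors for distinct eigenvalues are linearly independent
print(abs(np.linalg.det(V)) > 1e-8)   # True
# 2. for a symmetric matrix they are also mutually orthogonal
print(np.allclose(V.T @ V, np.eye(4), atol=1e-6))    # True
```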
  • Q3. (5pts) (Closest point theorem) Let $\mathcal{S}$ be a subspace of $\mathbb{R}^n$. Show that if $\mathbf{u}$ is the orthogonal projection of $\mathbf{y} \in \mathbb{R}^n$ onto $\mathcal{S}$, then $$ \|\mathbf{y} - \mathbf{u}\|^2 \le \|\mathbf{y} - \mathbf{w}\|^2 $$ for all $\mathbf{w} \in \mathcal{S}$. In words, $\mathbf{u}$ is the closest point in $\mathcal{S}$ to $\mathbf{y}$. (In statistics, this result says that the least squares solution gives the best fit to a data vector $\mathbf{y}$ using the predictors in $\mathbf{X}$.)
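A numerical illustration of the closest-point property (a sketch, assuming a randomly chosen subspace $\mathcal{S} = \mathcal{C}(\mathbf{X})$ and point $\mathbf{y}$):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))       # columns span a subspace S of R^5
y = rng.standard_normal(5)

P = X @ np.linalg.inv(X.T @ X) @ X.T  # orthogonal projector onto C(X)
u = P @ y                             # projection of y onto S

# no other point w in S is closer to y
for _ in range(100):
    w = X @ rng.standard_normal(2)
    assert np.linalg.norm(y - u) <= np.linalg.norm(y - w) + 1e-12
print("closest-point property verified")
```

The key step of the proof is that the residual $\mathbf{y} - \mathbf{u}$ is orthogonal to every vector in $\mathcal{S}$, so the Pythagorean theorem applies to $\mathbf{y} - \mathbf{w} = (\mathbf{y} - \mathbf{u}) + (\mathbf{u} - \mathbf{w})$.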
  • Q4. (6pts) $\mathbf{A} \in \mathbb{R}^{15 \times 10}$ has rank $5$.

    1. Give the dimensions of the $\mathbf{U}$, $\boldsymbol{\Sigma}$, and $\mathbf{V}$ matrices in the singular value decomposition (SVD) of $\mathbf{A}$.

    2. Give the dimensions of the $\mathbf{U}$, $\boldsymbol{\Sigma}$, and $\mathbf{V}$ matrices in the reduced-form (or thin) SVD of $\mathbf{A}$.

    3. Give the dimensions of the $\mathbf{U}$, $\boldsymbol{\Sigma}$, and $\mathbf{V}$ matrices in the full SVD of $\mathbf{A}$.

    4. Give the dimensions of the $\mathbf{Q}$ and $\boldsymbol{\Lambda}$ matrices in the spectral decomposition of the Gram matrix $\mathbf{A}'\mathbf{A} = \mathbf{Q} \boldsymbol{\Lambda} \mathbf{Q}'$.

    5. Give the dimensions of the $\mathbf{Q}$ and $\boldsymbol{\Lambda}$ matrices in the spectral decomposition of the Gram matrix $\mathbf{A}\mathbf{A}' = \mathbf{Q} \boldsymbol{\Lambda} \mathbf{Q}'$.

    6. Give the dimension of the Moore-Penrose generalized inverse $\mathbf{A}^+$.
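The dimensions above can be checked against NumPy's conventions; a minimal sketch, assuming a random $15 \times 10$ matrix constructed to have rank $5$:

```python
import numpy as np

rng = np.random.default_rng(1)
# a 15 x 10 matrix of rank 5, built as a product of thin factors
A = rng.standard_normal((15, 5)) @ rng.standard_normal((5, 10))

# full SVD
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)     # (15, 15) (10,) (10, 10)
print(np.sum(s > 1e-8))               # 5 nonzero singular values

# thin SVD
U2, s2, Vt2 = np.linalg.svd(A, full_matrices=False)
print(U2.shape, Vt2.shape)            # (15, 10) (10, 10)

print(np.linalg.matrix_rank(A))       # 5
print(np.linalg.pinv(A).shape)        # A+ is 10 x 15
```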

  • Q5. (5pts) Find the orthogonal projection of the point $\mathbf{1}_3$ onto the plane spanned by the vectors $\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} -2 \\ 2 \\ 1 \end{pmatrix}$. Hint: You can either use $\mathbf{P} = \mathbf{X} (\mathbf{X}'\mathbf{X})^{-1} \mathbf{X}'$ or $\mathbf{P} = \mathbf{Q} \mathbf{Q}'$ where the columns of $\mathbf{Q}$ are an orthonormal basis.
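The hinted projector formula can be checked directly in NumPy (a sanity check of the computation, not a substitute for showing the work):

```python
import numpy as np

X = np.array([[1.0, -2.0],
              [1.0,  2.0],
              [0.0,  1.0]])           # the two spanning vectors as columns
y = np.ones(3)                        # the point 1_3

P = X @ np.linalg.inv(X.T @ X) @ X.T  # projector onto the plane C(X)
u = P @ y
print(u)                              # [7/9, 11/9, 1/9]
```

Note that the two spanning vectors are already orthogonal ($\mathbf{X}'\mathbf{X}$ is diagonal), which makes the hand computation short.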
  • Q6. (5pts) Let $$ \mathbf{A} = \begin{pmatrix} 2 & 0 & 0 \\ 6 & 1 & 0 \\ -8 & 5 & 3 \end{pmatrix} \begin{pmatrix} 2 & 6 & -8 \\ 0 & 1 & 5 \\ 0 & 0 & 3 \end{pmatrix}. $$
    1. Is $\mathbf{A}$ a positive definite matrix? Why?
    2. Calculate $\det (\mathbf{A})$, $\det (\mathbf{A}^3)$, $\det (\mathbf{A}^{-1})$, and $\det (-2\mathbf{A})$.
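The determinant identities used here can be verified numerically; a sketch built from the two triangular factors in the problem:

```python
import numpy as np

L = np.array([[ 2.0, 0.0, 0.0],
              [ 6.0, 1.0, 0.0],
              [-8.0, 5.0, 3.0]])
A = L @ L.T                           # A = L L', L lower triangular

print(np.linalg.det(A))               # det(L)^2 = (2*1*3)^2 = 36
print(np.linalg.det(np.linalg.matrix_power(A, 3)))   # 36^3 = 46656
print(np.linalg.det(np.linalg.inv(A)))               # 1/36
print(np.linalg.det(-2 * A))          # (-2)^3 * 36 = -288
print(np.all(np.linalg.eigvalsh(A) > 0))             # True: A is pd
```

The structure $\mathbf{A} = \mathbf{L}\mathbf{L}'$ with a nonsingular $\mathbf{L}$ (a Cholesky-type factorization) is what makes positive definiteness immediate.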
  • Q7. (9pts) (Rayleigh quotient) Suppose $\mathbf{S} \in \mathbb{R}^{n \times n}$ is positive definite with eigenvalues $\lambda_1 > \lambda_2 \ge \cdots \ge \lambda_n > 0$ and corresponding eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n$.
    1. Show that the maximum value of the Rayleigh quotient $$ R(\mathbf{x}) = \frac{\mathbf{x}' \mathbf{S} \mathbf{x}}{\mathbf{x}' \mathbf{x}} $$ is $\lambda_1$.
    2. Show that the maximum value of the Rayleigh quotient $R(\mathbf{x})$, subject to the constraint $\mathbf{x} \perp \mathbf{u}_1$, is $\lambda_2$.
    3. Show that the minimum value of the Rayleigh quotient $R(\mathbf{x})$ is $\lambda_n$.
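All three claims can be illustrated numerically (a sketch, assuming a randomly generated positive definite $\mathbf{S}$; note `np.linalg.eigh` returns eigenvalues in ascending order, so $\lambda_1$ corresponds to the last column):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
S = B @ B.T + 4 * np.eye(4)           # positive definite example

lam, U = np.linalg.eigh(S)            # ascending eigenvalues

def rayleigh(x):
    return (x @ S @ x) / (x @ x)

print(np.isclose(rayleigh(U[:, -1]), lam[-1]))   # max attained at u_1
print(np.isclose(rayleigh(U[:, 0]), lam[0]))     # min attained at u_n
print(np.isclose(rayleigh(U[:, -2]), lam[-2]))   # lambda_2 on x ⊥ u_1

# R(x) never leaves [lambda_n, lambda_1] over random directions
xs = rng.standard_normal((1000, 4))
vals = np.array([rayleigh(x) for x in xs])
print(lam[0] - 1e-9 <= vals.min(), vals.max() <= lam[-1] + 1e-9)
```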
  • Q8. (5pts) Fundamental theorem of ranks.
    1. State the rank-nullity theorem. You don't need to prove it.
    2. Show that $\mathcal{N}(\mathbf{A}'\mathbf{A}) = \mathcal{N}(\mathbf{A})$.
    3. Show the fundamental theorem of ranks: $\text{rank}(\mathbf{A}) = \text{rank}(\mathbf{A}'\mathbf{A})$. Hint: use 1 and 2.
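A quick numerical check of parts 2 and 3 (a sketch, assuming a random matrix constructed to have rank 3):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 6))   # rank 3

# fundamental theorem of ranks: rank(A) = rank(A'A)
print(np.linalg.matrix_rank(A))          # 3
print(np.linalg.matrix_rank(A.T @ A))    # 3

# N(A'A) = N(A): a null vector of A'A is also a null vector of A
_, _, Vt = np.linalg.svd(A.T @ A)
z = Vt[-1]                                # vector in N(A'A)
print(np.allclose(A @ z, 0))              # True
```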
  • Q9. (5pts) Let $\mathbf{A}$ and $\mathbf{B}$ be two positive semidefinite matrices of the same size. Show that $\alpha \mathbf{A} + \beta \mathbf{B}$ is positive semidefinite for any $\alpha, \beta \ge 0$. Give a counterexample to show that this is not true if $\alpha$ and $\beta$ are allowed to be negative.
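The claim, and one possible counterexample for negative coefficients, can be checked numerically (the diagonal matrices below are an illustrative choice, not the only counterexample):

```python
import numpy as np

def is_psd(M, tol=1e-10):
    """Check positive semidefiniteness via the smallest eigenvalue."""
    return bool(np.all(np.linalg.eigvalsh(M) >= -tol))

A = np.array([[1.0, 0.0], [0.0, 0.0]])   # psd
B = np.array([[0.0, 0.0], [0.0, 1.0]])   # psd

print(is_psd(2 * A + 3 * B))              # True: alpha, beta >= 0
print(is_psd(A - B))                      # False: alpha = 1, beta = -1
```

Here $\mathbf{A} - \mathbf{B} = \mathrm{diag}(1, -1)$ has a negative eigenvalue, so it is not positive semidefinite.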
  • Q10. (5pts) Let $\mathbf{A} = \mathbf{x} \mathbf{y}'$, where $\mathbf{x} \in \mathbb{R}^m$ and $\mathbf{y} \in \mathbb{R}^n$ are non-zero vectors.
    1. What is the rank of $\mathbf{A}$?
    2. Find the reduced-form SVD of $\mathbf{A}$.
    3. Find the Moore-Penrose inverse of $\mathbf{A}$.
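The rank-1 structure makes all three parts explicit; a numerical sanity check (a sketch, assuming random nonzero vectors $\mathbf{x} \in \mathbb{R}^5$ and $\mathbf{y} \in \mathbb{R}^3$):

```python
import numpy as np

rng = np.random.default_rng(4)
x, y = rng.standard_normal(5), rng.standard_normal(3)
A = np.outer(x, y)                    # A = x y'

print(np.linalg.matrix_rank(A))       # 1

# thin SVD: a single triple (sigma, u, v)
u = x / np.linalg.norm(x)
v = y / np.linalg.norm(y)
sigma = np.linalg.norm(x) * np.linalg.norm(y)
print(np.allclose(A, sigma * np.outer(u, v)))        # True

# Moore-Penrose inverse: A+ = y x' / (|x|^2 |y|^2)
Aplus = np.outer(y, x) / (np.linalg.norm(x)**2 * np.linalg.norm(y)**2)
print(np.allclose(Aplus, np.linalg.pinv(A)))         # True
```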