Biostat 216 Homework 6+7

Due Friday, Dec 3 @ 11:59pm

Submit a PDF (scanned/photographed from handwritten solutions, or converted from RMarkdown or Jupyter Notebook) to Gradescope on CCLE.

Eigenvalues and eigenvectors

  • Q1. Diagonalize (show the steps to find eigenvalues and eigenvectors) $$ \mathbf{A} = \begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix} $$ and compute $\mathbf{X} \boldsymbol{\Lambda}^k \mathbf{X}^{-1}$ to prove the formula $$ \mathbf{A}^k = \frac 12 \begin{pmatrix} 1 + 3^k & 1 - 3^k \\ 1 - 3^k & 1 + 3^k \end{pmatrix}. $$ (A numerical sanity check appears at the end of this section.)
  • Q2. Suppose the same $\mathbf{X}$ diagonalizes both $\mathbf{A}$ and $\mathbf{B}$; that is, they share the same eigenvectors in $\mathbf{A} = \mathbf{X} \boldsymbol{\Lambda}_1 \mathbf{X}^{-1}$ and $\mathbf{B} = \mathbf{X} \boldsymbol{\Lambda}_2 \mathbf{X}^{-1}$. Prove that $\mathbf{A} \mathbf{B} = \mathbf{B} \mathbf{A}$.
  • Q3. Suppose the eigenvalues of a square matrix $\mathbf{A} \in \mathbb{R}^{n \times n}$ are $\lambda_1, \ldots, \lambda_n$. Show that $\det (\mathbf{A}) = \prod_{i=1}^n \lambda_i$. Hint: the $\lambda_i$'s are the roots of the characteristic polynomial.
  • Q4. True or false. For each statement, indicate whether it is true or false and give a brief explanation.

    1. If the columns of $\mathbf{X}$ (eigenvectors of a square matrix $\mathbf{A}$) are linearly independent, then (a) $\mathbf{A}$ is invertible; (b) $\mathbf{A}$ is diagonalizable; (c) $\mathbf{X}$ is invertible; (d) $\mathbf{X}$ is diagonalizable.

    2. If the eigenvalues of $\mathbf{A}$ are 2, 2, 5 then the matrix is certainly (a) invertible; (b) diagonalizable.

    3. If the only eigenvectors of $\mathbf{A}$ are multiples of $\begin{pmatrix} 1 \\ 4 \end{pmatrix}$, then the matrix has (a) no inverse; (b) a repeated eigenvalue; (c) no diagonalization $\mathbf{X} \boldsymbol{\Lambda} \mathbf{X}^{-1}$.

  • Q5. Let $\mathbf{A} \in \mathbb{R}^{m \times n}$ and $\mathbf{B} \in \mathbb{R}^{n \times m}$. Show that $\mathbf{A} \mathbf{B}$ and $\mathbf{B} \mathbf{A}$ share the same non-zero eigenvalues. Hint: examine the eigen-equations. (A randomized check appears at the end of this section.)
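
A quick numerical sanity check for Q1 and Q3 (a companion to, not a substitute for, the hand derivation; numpy and the choice $k = 5$ are mine):

```python
import numpy as np

# Q1: diagonalize A and compare X Lambda^k X^{-1} with the closed form.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
lam, X = np.linalg.eig(A)                   # eigenvalues 3 and 1 (in some order)

k = 5
Ak = X @ np.diag(lam**k) @ np.linalg.inv(X)
Ak_formula = 0.5 * np.array([[1 + 3.0**k, 1 - 3.0**k],
                             [1 - 3.0**k, 1 + 3.0**k]])
print(np.allclose(Ak, Ak_formula))                    # True
print(np.allclose(Ak, np.linalg.matrix_power(A, k)))  # True

# Q3: det(A) equals the product of the eigenvalues.
print(np.isclose(np.linalg.det(A), np.prod(lam)))     # True
```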
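
Likewise for Q5, one can spot-check on a random rectangular pair that the nonzero eigenvalues of $\mathbf{A}\mathbf{B}$ and $\mathbf{B}\mathbf{A}$ agree; the dimensions $m = 3$, $n = 5$ and the seed below are arbitrary:

```python
import numpy as np

# Q5: AB (m x m) and BA (n x n) share the same nonzero eigenvalues.
rng = np.random.default_rng(216)
A = rng.standard_normal((3, 5))
B = rng.standard_normal((5, 3))

eig_AB = np.linalg.eigvals(A @ B)            # 3 eigenvalues, generically nonzero
eig_BA = np.linalg.eigvals(B @ A)            # 5 eigenvalues; n - m = 2 are ~0
nonzero_BA = eig_BA[np.abs(eig_BA) > 1e-8]   # drop the numerically zero ones
print(np.allclose(np.sort_complex(eig_AB), np.sort_complex(nonzero_BA)))  # True
```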

Positive definite matrices

  • Q6. Suppose $\mathbf{C}$ is positive definite and $\mathbf{A}$ has independent columns. Apply the energy test to show that $\mathbf{A}' \mathbf{C} \mathbf{A}$ is positive definite. (A numerical spot check appears at the end of this section.)
  • Q7. Suppose $\mathbf{S}$ is positive definite with eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$ and corresponding orthonormal eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n$. (Numerical spot checks of parts 3-5 appear at the end of this section.)

    1. What are the eigenvalues of the matrix $\lambda_1 \mathbf{I} - \mathbf{S}$? Is it positive semidefinite?
    2. How does it follow that $\lambda_1 \mathbf{x}'\mathbf{x} \ge \mathbf{x}' \mathbf{S} \mathbf{x}$ for every $\mathbf{x}$?
    3. Draw the conclusion: The maximum value of the Rayleigh quotient $$ R(\mathbf{x}) = \frac{\mathbf{x}'\mathbf{S}\mathbf{x}}{\mathbf{x}'\mathbf{x}} $$ is $\lambda_1$.
    4. Show that the maximum value of the Rayleigh quotient subject to the condition $\mathbf{x}\perp \mathbf{u}_1$ is $\lambda_2$. Hint: expand $\mathbf{x}$ in eigenvectors $\mathbf{u}_1,\ldots,\mathbf{u}_n$.
    5. Show that the maximum value of the Rayleigh quotient subject to the conditions $\mathbf{x}\perp \mathbf{u}_1$ and $\mathbf{x}\perp \mathbf{u}_2$ is $\lambda_3$.
    6. What is the maximum value of $\frac 12 \mathbf{x}' \mathbf{S} \mathbf{x}$ subject to the constraint $\mathbf{x}'\mathbf{x}=1$? Hint: write down the Lagrangian and set its derivative to zero.
    7. Let $\mathbf{A} \in \mathbb{R}^{m \times n}$ with largest singular value $\sigma_1$. Find the maximum value of $\frac{\|\mathbf{A} \mathbf{x}\|}{\|\mathbf{x}\|}$.
    8. Let $\mathbf{B}$ be a submatrix of $\mathbf{A} \in \mathbb{R}^{m \times n}$. Show that the largest singular value of $\mathbf{B}$ is always less than or equal to the largest singular value of $\mathbf{A}$. Hint: use Q7.7.
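
A randomized illustration of the energy test in Q6 (the sizes and seed are arbitrary choices, and passing it is of course not a proof):

```python
import numpy as np

# Q6: with C positive definite and A of full column rank,
# x'(A'CA)x = (Ax)' C (Ax) > 0 for every x != 0.
rng = np.random.default_rng(216)
M = rng.standard_normal((4, 4))
C = M @ M.T + np.eye(4)              # positive definite by construction
A = rng.standard_normal((4, 2))      # independent columns (generically)

S = A.T @ C @ A
print(np.linalg.eigvalsh(S))         # both eigenvalues strictly positive
x = rng.standard_normal(2)
print(x @ S @ x > 0)                                 # True: positive energy
print(np.isclose(x @ S @ x, (A @ x) @ C @ (A @ x)))  # True: same quadratic form
```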
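
And a spot check of Q7.3-Q7.5 on a random positive definite $\mathbf{S}$: the Rayleigh quotient never exceeds $\lambda_1$, and once the $\mathbf{u}_1$ component is projected out it never exceeds $\lambda_2$:

```python
import numpy as np

# Q7.3-7.5: Rayleigh quotient maxima. S is a random positive definite matrix.
rng = np.random.default_rng(216)
M = rng.standard_normal((5, 5))
S = M @ M.T + np.eye(5)
lam, U = np.linalg.eigh(S)                    # ascending eigenvalues
lam, U = lam[::-1], U[:, ::-1]                # reorder: lam[0] >= lam[1] >= ...

def R(x):
    return (x @ S @ x) / (x @ x)

X = rng.standard_normal((5, 10000))           # random directions
print(max(R(x) for x in X.T) <= lam[0])       # True: R never exceeds lambda_1
print(np.isclose(R(U[:, 0]), lam[0]))         # True: attained at u1

u1 = U[:, 0]
Xp = X - np.outer(u1, u1 @ X)                 # project out the u1 component
print(max(R(x) for x in Xp.T) <= lam[1])      # True: now bounded by lambda_2
print(np.isclose(R(U[:, 1]), lam[1]))         # True: attained at u2
```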

SVD

  • Q8. Find the closest rank-1 approximation (in the Frobenius norm or the L2 norm) to these matrices $$ \mathbf{A} = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad \mathbf{A} = \begin{pmatrix} 0 & 3 \\ 2 & 0 \end{pmatrix}, \quad \mathbf{A} = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \quad \mathbf{A} = \begin{pmatrix} \cos \theta & - \sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}. $$ (A worked example appears at the end of this section.)
  • Q9. Moore-Penrose inverse from SVD. (A numerical verification appears at the end of this section.)
    1. With singular value decomposition $\mathbf{X} = \mathbf{U} \boldsymbol{\Sigma} \mathbf{V}'$, verify that $$ \mathbf{X}^+ = \mathbf{V} \boldsymbol{\Sigma}^+ \mathbf{U}' = \mathbf{V}_r \boldsymbol{\Sigma}_r^{-1} \mathbf{U}_r' = \sum_{i=1}^r \sigma_i^{-1} \mathbf{v}_i \mathbf{u}_i', $$ where $\boldsymbol{\Sigma}^+ = \text{diag}(\sigma_1^{-1}, \ldots, \sigma_r^{-1}, 0, \ldots, 0)$ and $r= \text{rank}(\mathbf{X})$, satisfies the four properties of the Moore-Penrose inverse.
    2. Show that $\text{rank}(\mathbf{X}^+) = \text{rank}(\mathbf{X})$.
    3. Show that $\mathbf{X}^+ \mathbf{X}$ is the orthogonal projector onto $\mathcal{C}(\mathbf{X}')$ and $\mathbf{X} \mathbf{X}^+$ is the orthogonal projector onto $\mathcal{C}(\mathbf{X})$.
    4. Show that $\boldsymbol{\beta}^+ = \mathbf{X}^+ \mathbf{y}$ is a minimizer of the least squares criterion $f(\boldsymbol{\beta}) = \|\mathbf{y} - \mathbf{X} \boldsymbol{\beta}\|^2$. Hint: check $\boldsymbol{\beta}^+$ satisfies the normal equation $\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \mathbf{X}'\mathbf{y}$.
    5. Show that $\boldsymbol{\beta}^+ \in \mathcal{C}(\mathbf{X}')$.
    6. Show that if another $\boldsymbol{\beta}^\star$ minimizes $f(\boldsymbol{\beta})$, then $\|\boldsymbol{\beta}^\star\| \ge \|\boldsymbol{\beta}^+\|$. This says that $\boldsymbol{\beta}^+ = \mathbf{X}^+ \mathbf{y}$ is the least squares solution with smallest L2 norm. Hint: since both $\boldsymbol{\beta}^\star$ and $\boldsymbol{\beta}^+$ satisfy the normal equation, $\mathbf{X}'\mathbf{X} \boldsymbol{\beta}^\star = \mathbf{X}'\mathbf{X} \boldsymbol{\beta}^+$ and deduce that $\boldsymbol{\beta}^\star - \boldsymbol{\beta}^+ \in \mathcal{N}(\mathbf{X})$.
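
A numerical companion to Q8: by the Eckart-Young theorem, the closest rank-1 matrix in either norm is $\sigma_1 \mathbf{u}_1 \mathbf{v}_1'$ from the SVD. The sketch below treats the second matrix in the list; the other three follow the same pattern:

```python
import numpy as np

# Q8: best rank-1 approximation of A = [[0, 3], [2, 0]] via truncated SVD.
A = np.array([[0.0, 3.0],
              [2.0, 0.0]])
U, s, Vt = np.linalg.svd(A)                   # singular values 3 and 2
A1 = s[0] * np.outer(U[:, 0], Vt[0])          # sigma_1 u_1 v_1'
print(A1)                                     # [[0., 3.], [0., 0.]]
print(np.linalg.norm(A - A1, 2))              # L2 error = sigma_2 = 2
print(np.linalg.norm(A - A1, 'fro'))          # Frobenius error also 2 here
```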
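
And for Q9, a numerical check that $\mathbf{V}_r \boldsymbol{\Sigma}_r^{-1} \mathbf{U}_r'$ satisfies the four Penrose conditions and that $\boldsymbol{\beta}^+ = \mathbf{X}^+\mathbf{y}$ solves the normal equations; the rank-deficient $6 \times 4$ test matrix is an arbitrary choice:

```python
import numpy as np

# Q9.1/9.4: build X+ = V_r Sigma_r^{-1} U_r' from the SVD of a random
# rank-deficient X (6 x 4 of rank 2) and verify the Penrose conditions.
rng = np.random.default_rng(216)
X = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))   # rank 2
U, s, Vt = np.linalg.svd(X)
r = int(np.sum(s > 1e-10))                                      # numerical rank
Xp = Vt[:r].T @ np.diag(1 / s[:r]) @ U[:, :r].T                 # V_r S_r^{-1} U_r'

print(np.allclose(X @ Xp @ X, X))             # (1) X X+ X = X
print(np.allclose(Xp @ X @ Xp, Xp))           # (2) X+ X X+ = X+
print(np.allclose((X @ Xp).T, X @ Xp))        # (3) X X+ is symmetric
print(np.allclose((Xp @ X).T, Xp @ X))        # (4) X+ X is symmetric
print(np.allclose(Xp, np.linalg.pinv(X)))     # agrees with numpy's pinv

y = rng.standard_normal(6)
beta = Xp @ y                                 # beta+ = X+ y
print(np.allclose(X.T @ X @ beta, X.T @ y))   # satisfies the normal equations
```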

Optimization and multivariate calculus

  • Q10.
    1. Explain why the intersection $K_1 \cap K_2$ of two convex sets is a convex set.
    2. Prove that the pointwise maximum $F_3 = \max(F_1, F_2)$ of two convex functions $F_1$ and $F_2$ is a convex function. Hint: What is the set above the graph of $F_3$?
  • Q11. Show that these functions are convex (randomized spot checks appear after Q12 below):
    1. Entropy $x \log x$.
    2. $\log (e^x + e^y)$.
    3. $\ell_p$ norm $\|\mathbf{x}\|_p = (|x_1|^p + |x_2|^p)^{1/p}$, $p \ge 1$.
    4. $\lambda_{\max}(\mathbf{S})$ as a function of the symmetric matrix $\mathbf{S}$. Hint: Q7.6 and Q10.2.
  • Q12. Minimize $f(x_1,x_2)= \frac 12 \mathbf{x}'\mathbf{S} \mathbf{x}= \frac 12 x_1^2 + 2 x_2^2$ subject to the constraint $\mathbf{A}'\mathbf{x}=x_1 + 3x_2 = b$. (A numerical solution sketch appears after this list.)
    1. What is the Lagrangian $L(\mathbf{x},\lambda)$ for this problem?
    2. What are the three equations "derivative of $L$ = zero"?
    3. Solve these equations to find $\mathbf{x}^\star = (x_1^\star, x_2^\star)$ and the multiplier $\lambda^\star$.
    4. Verify that the derivative of the minimum cost is $\partial f^\star / \partial b = -\lambda^\star$.
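
Randomized midpoint-convexity spot checks for Q11, testing $f\left(\frac{u+v}{2}\right) \le \frac{f(u)+f(v)}{2}$ at random points (suggestive only; the homework asks for proofs):

```python
import numpy as np

# Q11: f((u + v) / 2) <= (f(u) + f(v)) / 2 should hold for convex f.
rng = np.random.default_rng(216)

def midpoint_ok(f, pairs):
    return all(f((u + v) / 2) <= (f(u) + f(v)) / 2 + 1e-12 for u, v in pairs)

f1 = lambda x: x * np.log(x)                           # x log x on x > 0
f2 = lambda z: np.log(np.exp(z[0]) + np.exp(z[1]))     # log(e^x + e^y)
f3 = lambda z: np.linalg.norm(z, ord=1.5)              # an l_p norm, p = 1.5

print(midpoint_ok(f1, rng.uniform(0.1, 5.0, size=(1000, 2))))   # True
print(midpoint_ok(f2, rng.standard_normal((1000, 2, 2))))       # True
print(midpoint_ok(f3, rng.standard_normal((1000, 2, 2))))       # True
```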
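
For Q12.3-Q12.4, the stationarity equations form a $3 \times 3$ linear system. The sketch below solves it, using the sign convention $L = f + \lambda(\mathbf{A}'\mathbf{x} - b)$ that matches the claimed $\partial f^\star/\partial b = -\lambda^\star$, and checks the sensitivity by a finite difference:

```python
import numpy as np

# Q12: minimize (1/2) x'Sx, S = diag(1, 4), subject to a'x = b, a = (1, 3)'.
S = np.diag([1.0, 4.0])
a = np.array([1.0, 3.0])

def solve(b):
    # Setting derivatives of L(x, lam) = (1/2) x'Sx + lam (a'x - b) to zero:
    #   S x + lam a = 0  (two equations)  and  a'x = b  (one equation).
    K = np.block([[S, a[:, None]],
                  [a[None, :], np.zeros((1, 1))]])
    x1, x2, lam = np.linalg.solve(K, [0.0, 0.0, b])
    f = 0.5 * (x1**2 + 4 * x2**2)
    return np.array([x1, x2]), lam, f

x_star, lam_star, f_star = solve(b=2.0)
print(x_star, lam_star, f_star)               # solution and multiplier at b = 2

# Q12.4: df*/db = -lambda*, checked by a centered finite difference.
h = 1e-6
dfdb = (solve(2.0 + h)[2] - solve(2.0 - h)[2]) / (2 * h)
print(np.isclose(dfdb, -lam_star))            # True
```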