Least-Squares

In practical applications of least-squares polynomial fitting, it is always best to use the highest-order polynomial that is computationally feasible.
  1. True
    Incorrect.
  2. False
    Correct! Raising the polynomial order always reduces the residual at the data points, but a high-order fit tends to fit the noise and oscillate between the samples; a lower-order model often generalizes better. See the sketch below.
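Below is a minimal numerical sketch of this point, under assumed data (noisy samples of $e^t$; the degrees and noise level are illustrative, not from the original exercise). It uses NumPy's polyfit, which computes a least-squares polynomial fit: the high-degree fit typically has a smaller residual at the sample points but a larger error between them.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 12)                       # 12 sample points
y = np.exp(t) + 0.05 * rng.standard_normal(12)   # noisy measurements of exp(t)

t_fine = np.linspace(-1, 1, 400)
for deg in (2, 10):
    c = np.polyfit(t, y, deg)                    # least-squares fit of given degree
    res = np.linalg.norm(np.polyval(c, t) - y)   # residual at the sample points
    err = np.abs(np.polyval(c, t_fine) - np.exp(t_fine)).max()
    print(f"degree {deg:2d}: residual {res:.2e}, max error between samples {err:.2e}")
```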

Least-squares approximate solution of overdetermined equations

Suppose $b = Ax + v$, where $x \in \mathbf{R}^n$ is a vector of parameters you wish to estimate, $b \in \mathbf{R}^m$ is a vector of measurements, and $v \in \mathbf{R}^m$ is measurement noise. We assume $m > n$, so there are more measurements than parameters. Consider a linear estimator of the form $\hat{x} = Bb$, with $B \in \mathbf{R}^{n \times m}$.
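A natural choice is $B = (A^TA)^{-1}A^T$, the least-squares estimator, which gives $\hat{x} = \mathrm{argmin}_x \|Ax - b\|^2$ and satisfies $BA = I$ when $A$ is skinny and full rank. A minimal sketch (the dimensions and noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 50, 4
A = rng.standard_normal((m, n))              # skinny and (almost surely) full rank
x = rng.standard_normal(n)                   # true parameters
b = A @ x + 0.01 * rng.standard_normal(m)    # measurements b = Ax + v

B = np.linalg.pinv(A)                        # equals (A^T A)^{-1} A^T for full-rank A
print(np.allclose(B @ A, np.eye(n)))         # True: BA = I
print(np.linalg.norm(B @ b - x))             # small estimation error
```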

Least-Squares Data Fitting
Least-squares function fitting works well for interpolation, but should never be used for extrapolation.
  1. True
    Incorrect.
  2. False
    Correct! Extrapolation is always riskier than interpolation, but a least-squares fit with a well-chosen model class can extrapolate sensibly; "never" is too strong. See the sketch below.
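A minimal sketch of the distinction (the linear data model, degrees, and evaluation point are assumptions for illustration): with an appropriate model class a least-squares fit can extrapolate sensibly, while an over-parameterized fit of the same data usually cannot.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 20)
y = 1.0 + 2.0 * t + 0.05 * rng.standard_normal(20)  # data with a true linear trend

line = np.polyfit(t, y, 1)      # fit in the correct model class
poly = np.polyfit(t, y, 9)      # over-parameterized fit
t_out = 2.0                     # a point outside the data interval [0, 1]
print(np.polyval(line, t_out))  # close to the true value 1 + 2*2 = 5
print(np.polyval(poly, t_out))  # typically far from 5
```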

Regularization
Regularized least-squares, i.e., choosing $x$ to minimize $\|Ax - b\|^2 + \mu\|x\|^2$ with $\mu > 0$,
  1. can always be done, even when $A$ is not skinny
    Correct! For $\mu > 0$ the stacked matrix $\begin{bmatrix} A \\ \sqrt{\mu}\, I \end{bmatrix}$ has full column rank for any $A$, so the regularized problem always has a unique solution; see the sketch after this list.
  2. fails when $A$ is not skinny
    Incorrect.
  3. requires only that $A$ is nonzero
    Incorrect.
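A small sketch of why this always works (the dimensions and value of $\mu$ below are illustrative assumptions): even when $A$ is wide, the stacked problem is an ordinary full-rank least-squares problem, and its solution agrees with the normal-equations formula $x = (A^TA + \mu I)^{-1}A^Tb$.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, mu = 5, 20, 0.1               # wide A: plain least-squares is underdetermined
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Stack A on top of sqrt(mu)*I; the result has full column rank for any A.
A_stack = np.vstack([A, np.sqrt(mu) * np.eye(n)])
b_stack = np.concatenate([b, np.zeros(n)])
x = np.linalg.lstsq(A_stack, b_stack, rcond=None)[0]

# Same minimizer from the normal equations (A^T A + mu I) x = A^T b.
x_check = np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)
print(np.allclose(x, x_check))      # True
```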

Suppose that $x$ minimizes $J_1(x) + \mu J_2(x)$, for some value of $\mu > 0$, but you'd like to find a point with a smaller value of $J_2$, if possible. You should
  1. decrease the parameter $\mu$ and minimize $J_1+\mu J_2$
    Incorrect.
  2. increase the parameter $\mu$ and minimize $J_1+\mu J_2$
    Correct! Increasing $\mu$ places more weight on $J_2$, so the new minimizer has a smaller (or equal) value of $J_2$, at the cost of a larger (or equal) value of $J_1$; see the sketch below.
  3. minimize $J_1 + (1/\mu)J_2$
    Incorrect.
  4. minimize $J_1 − \mu J_2$
    Incorrect.
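A quick sketch of the tradeoff, taking $J_1 = \|Ax - b\|^2$ and $J_2 = \|x\|^2$ with illustrative $A$ and $b$ (assumptions, since the question leaves $J_1$ and $J_2$ abstract): as $\mu$ increases, the minimizer's value of $J_2$ decreases while $J_1$ increases.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)

# Minimize J1 + mu*J2 = ||Ax - b||^2 + mu*||x||^2 for increasing mu.
for mu in (0.01, 0.1, 1.0, 10.0):
    x = np.linalg.solve(A.T @ A + mu * np.eye(5), A.T @ b)
    print(f"mu = {mu:5.2f}: J2 = {x @ x:.4f}, J1 = {np.linalg.norm(A @ x - b)**2:.4f}")
```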