$\newcommand{\ones}{\mathbf 1}$

Least squares is a special case of convex optimization.
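To make this concrete (the formulation below is a standard one, not taken verbatim from the text): the least squares problem \[ \begin{array}{ll} \mbox{minimize} & \|Ax-b\|_2^2 \end{array} \] has objective $x^TA^TAx - 2b^TAx + b^Tb$, a quadratic with Hessian $2A^TA \succeq 0$, hence convex; its minimizers are exactly the solutions of the normal equations $A^TA x = A^Tb$. So least squares is a convex optimization problem with no constraints and a convex quadratic objective.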

By and large, convex optimization problems can be solved efficiently.

Many problems that arise in practice can be formulated, or well approximated, as convex problems; but by no means all of them, and recognizing or constructing a convex formulation often takes real effort.

Convex optimization problems are attractive because any locally optimal point is globally optimal. The solution need not be unique, however: the optimal set is convex, and uniqueness is guaranteed only under additional conditions such as strict convexity of the objective.

In a device sizing problem the goal is to minimize power consumption subject to the total area not exceeding 50, as well as some timing and manufacturing constraints. Four candidate designs meet the timing and manufacturing constraints, and have power and area listed in the table below. \[ \begin{array}{lll} {\rm Design} & {\rm Power} & {\rm Area} \\ \hline {\rm A} & 10 & 50 \\ \hline {\rm B} & 8 & 55 \\ \hline {\rm C} & 10 & 45 \\ \hline {\rm D} & 11 & 50 \\ \hline \end{array} \]

Design B has lower power than design A (8 versus 10), but its area of 55 exceeds the limit of 50, so B is infeasible and cannot be preferred to A.

Design C has the same power as design A (10) with smaller area (45 versus 50). Since the objective is power alone, C achieves the same objective value as A; both are feasible, with C satisfying the area constraint with more slack.

Design D cannot be optimal: it is feasible (area 50), but its power of 11 exceeds that of the feasible designs A and C, which consume only 10.
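The feasibility and optimality checks above can be sketched in a few lines. This is an illustrative script, not part of the original text; the variable names and the area limit of 50 are taken from the table and problem statement.

```python
# Designs from the table: name -> (power, area).
designs = {"A": (10, 50), "B": (8, 55), "C": (10, 45), "D": (11, 50)}

AREA_LIMIT = 50  # total area must not exceed 50

# Keep only the designs satisfying the area constraint.
feasible = {k: (p, a) for k, (p, a) in designs.items() if a <= AREA_LIMIT}

# Among feasible designs, the objective is power alone.
best_power = min(p for p, _ in feasible.values())
optimal = sorted(k for k, (p, _) in feasible.items() if p == best_power)

print(sorted(feasible))  # B is excluded: its area (55) violates the limit
print(optimal)           # D is feasible but dominated on power
```

Running this shows that B drops out as infeasible, and that A and C tie for minimum power among the feasible designs, which is why D cannot be optimal.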

Very roughly, how long would it take to solve a linear program with 100 variables and 1000 constraints on a computer capable of carrying out a billion floating point operations per second (1 gigaflop)?
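One rough back-of-the-envelope estimate, using the standard cost model for interior-point methods (the constants here are typical assumptions, not given in the text): with $n = 100$ variables and $m = 1000$ constraints, each iteration costs on the order of \[ n^2 m = 100^2 \cdot 1000 = 10^7 \ \mbox{flops}, \] and interior-point methods typically converge in a few tens of iterations, say around 30, for a total of roughly $3 \times 10^8$ flops. At $10^9$ flops per second, this is well under a second.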

Local optimization