EE364b - Convex Optimization II

Instructor: Mert Pilanci, pilanci@stanford.edu

EE364b is the same as CME364b and was originally developed by Stephen Boyd

Announcements

  • Homework 6 is available and due May 14. You will learn about decomposition methods, distributed regression, and multi-commodity flow problems.

  • Homework 5 is available and due May 7. You will learn about cutting-plane methods and interior point methods.

  • Homework 4 is available and due April 30. You will learn about Bregman divergences, mirror descent, and dual averaging.

  • Homework 3 is available and due April 23. You will learn about stochastic portfolio optimization and stochastic linear solvers.

  • Homework 2 is available and due April 16. You will learn about subgradient descent, alternating projections, and Sion's minimax theorem, and experiment with group Lasso and signal recovery.

  • Homework 1 is available and due April 9. You will learn about subgradients/subdifferentials, automatic differentiation using PyTorch (and when it fails), generalized gradients (the Clarke subdifferential), and some interesting facts about non-convexity in neural networks!

  • Annotated slides, original slides, and animations shown in class are available on Canvas.

  • We will have live Zoom lectures starting on March 30, 10:30-11:50am. Please see Canvas for the Zoom links.

  • Welcome to EE364b, Spring quarter 2020-2021.

Course description

Continuation of EE364a. Subgradient, cutting-plane, and ellipsoid methods. Decentralized convex optimization via primal and dual decomposition. Monotone operators and proximal methods; the alternating direction method of multipliers (ADMM). Exploiting problem structure in implementation. Convex relaxations of hard problems. Global optimization via branch and bound. Robust and stochastic optimization. Applications in areas such as control, circuit design, signal processing, machine learning, and communications. The class culminates in a final project.
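To give a flavor of the first topic listed above, here is a minimal sketch of the subgradient method on a simple nondifferentiable function, f(x) = |x - 3|. The particular function, the diminishing step sizes 1/(k+1), and the best-iterate bookkeeping are illustrative choices, not taken from the course materials.

```python
def subgradient(x):
    """Return a subgradient of f(x) = |x - 3|.

    At the kink x = 3, any value in [-1, 1] is a valid subgradient;
    we return 0 there.
    """
    if x > 3:
        return 1.0
    elif x < 3:
        return -1.0
    return 0.0


def subgradient_method(x0, iters=2000):
    """Run the subgradient method with diminishing steps 1/(k+1).

    The subgradient method is not a descent method, so we track the
    best iterate found so far rather than returning the last one.
    """
    x = x0
    best = x
    for k in range(iters):
        x = x - subgradient(x) / (k + 1)  # diminishing step size
        if abs(x - 3) < abs(best - 3):
            best = x
    return best


x_star = subgradient_method(x0=10.0)
print(x_star)  # slowly approaches the minimizer x = 3
```

With diminishing (square-summable but not summable) step sizes, the best iterate converges to the minimizer, though slowly; this tradeoff is a central theme of the subgradient-method lectures.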

Prerequisite:

EE364a - Convex Optimization I