EE364b - Convex Optimization II

Instructor: Mert Pilanci, pilanci@stanford.edu

EE364b is the same as CME364b and was originally developed by Stephen Boyd

Announcements

  • Homework 2 is available and due April 16. You will learn about subgradient descent, alternating projections, and Sion's minimax theorem, and experiment with group Lasso and signal recovery.

  • Homework 1 is available and due April 9. You will learn about subgradients/subdifferentials, automatic differentiation using PyTorch (and when it fails), and generalized gradients (the Clarke subdifferential), along with some interesting facts about non-convexity in neural networks!

  • Annotated slides, original slides, and the animations shown in class are available on Canvas.

  • We will have live Zoom lectures starting on March 30, 10:30-11:50am. Please see Canvas for the Zoom links.

  • Welcome to EE364b, Spring quarter 2020-2021.
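To give a flavor of one homework topic, here is a minimal toy sketch of alternating projections (this is my own illustrative example, not the Homework 2 problem): we alternately project a point onto two lines in the plane whose intersection is the point (1, 1), and the iterates converge to that intersection.

```python
# Toy alternating-projections example (illustrative only, not from the course):
# two convex sets in R^2 whose intersection is the single point (1, 1):
#   C = {(x, y) : y = x}   (the diagonal line)
#   D = {(x, y) : y = 1}   (a horizontal line)

def project_C(p):
    """Euclidean projection onto the line y = x."""
    x, y = p
    m = (x + y) / 2.0
    return (m, m)

def project_D(p):
    """Euclidean projection onto the line y = 1."""
    x, _ = p
    return (x, 1.0)

p = (5.0, -2.0)  # arbitrary starting point
for _ in range(60):
    p = project_D(project_C(p))

# p is now very close to the intersection point (1, 1).
```

Each full round of projections halves the distance to the intersection in this example, so convergence is linear; for general convex sets the method still converges to a point in the intersection (when it is nonempty), though possibly slowly.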

Course description

Continuation of 364A. Subgradient, cutting-plane, and ellipsoid methods. Decentralized convex optimization via primal and dual decomposition. Monotone operators and proximal methods; alternating direction method of multipliers. Exploiting problem structure in implementation. Convex relaxations of hard problems. Global optimization via branch and bound. Robust and stochastic optimization. Applications in areas such as control, circuit design, signal processing, machine learning and communications. This class will culminate in a final project.
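As a taste of the first topic, here is a minimal sketch of the subgradient method on a simple nondifferentiable convex function (my own example, not course code): minimizing f(x) = |x - 3| with a diminishing step-size rule.

```python
# Illustrative subgradient method (toy example, not from the course materials):
# minimize f(x) = |x - 3|, which is convex but not differentiable at x = 3.
# A subgradient of f at x is sign(x - 3) for x != 3; any g in [-1, 1] works at x = 3.

def subgradient(x):
    """Return one subgradient of f(x) = |x - 3| at x."""
    if x > 3:
        return 1.0
    elif x < 3:
        return -1.0
    return 0.0  # 0 is a valid subgradient at the minimizer

x = 0.0
for k in range(1000):
    step = 1.0 / (k + 1)          # diminishing, non-summable step sizes
    x = x - step * subgradient(x)

# x is now close to the minimizer x* = 3.
```

Unlike gradient descent, the subgradient method is not a descent method (the objective can increase on individual steps), which is why diminishing step sizes are used to guarantee convergence.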

Prerequisites:

EE364a - Convex Optimization I