Large-scale convex optimization

Advances in randomized numerical linear algebra (randNLA) improve the runtime of a variety of fundamental linear algebraic operations. The resulting advances in low-rank matrix approximation have important implications throughout optimization: many subproblems admit speedups by using low-rank approximations to data matrices, problem variables, or Hessians. Our lab has used fast low-rank approximations to accelerate cross-validation, dynamic optimization, optimal sensing, semidefinite programming, linear system solvers, composite optimization, conic optimization, and stochastic optimization. For example, we developed a low-rank preconditioner for solving linear systems (NystromPCG) and used it to speed up solution times for well-studied statistical learning problems such as the LASSO, SVM, and logistic regression by an order of magnitude (NysADMM and GeNIOS).
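To make the idea concrete, here is a minimal Python sketch of Nyström-style preconditioning for conjugate gradient, in the spirit of NystromPCG but not the lab's released code: a randomized rank-r Nyström approximation of a positive semidefinite matrix is used to precondition a ridge-regularized least-squares system. The rank r, regularization mu, problem sizes, and helper names are illustrative assumptions.

```python
# Minimal sketch of Nystrom-style preconditioning for CG (illustrative only;
# not the lab's released NystromPCG code). Rank r, shift mu, and problem
# sizes are assumptions chosen for demonstration.
import numpy as np

def nystrom_approx(A_mv, n, r, rng):
    """Randomized rank-r Nystrom approximation A ~ U diag(lam) U^T,
    built from r matrix-vector products with A."""
    Omega = rng.standard_normal((n, r))
    Omega, _ = np.linalg.qr(Omega)                    # orthonormal test matrix
    Y = A_mv(Omega)                                   # Y = A @ Omega
    nu = np.sqrt(n) * np.finfo(float).eps * np.linalg.norm(Y)  # stabilizing shift
    Y_nu = Y + nu * Omega
    C = np.linalg.cholesky(Omega.T @ Y_nu)
    B = np.linalg.solve(C, Y_nu.T).T                  # B = Y_nu @ C^{-T}
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    lam = np.maximum(s**2 - nu, 0.0)                  # remove the shift
    return U, lam

def pcg(A_mv, b, P_solve, tol=1e-8, maxiter=500):
    """Textbook preconditioned conjugate gradient."""
    x = np.zeros_like(b)
    res = b - A_mv(x)
    z = P_solve(res)
    p = z.copy()
    rz = res @ z
    for _ in range(maxiter):
        Ap = A_mv(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        res -= alpha * Ap
        if np.linalg.norm(res) <= tol * np.linalg.norm(b):
            break
        z = P_solve(res)
        rz_new = res @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Demo: ridge-regularized normal equations (X^T X + mu I) w = X^T y.
rng = np.random.default_rng(0)
n_samples, d, mu, r = 2000, 500, 1e-2, 50
X = rng.standard_normal((n_samples, d)) * np.logspace(0, -3, d)  # ill-conditioned columns
y = rng.standard_normal(n_samples)
A_mv = lambda V: X.T @ (X @ V) + mu * V               # works for vectors and blocks

U, lam = nystrom_approx(A_mv, d, r, rng)

def P_solve(v):
    """Apply the inverse preconditioner built from the Nystrom factors:
    damp the dominant eigenspace, act as the identity on its complement."""
    Uv = U.T @ v
    return U @ ((lam[-1] + mu) / (lam + mu) * Uv) + (v - U @ Uv)

w = pcg(A_mv, X.T @ y, P_solve)
```

The construction touches the system matrix only through matrix-vector products and stores just the n-by-r factor, which is what makes low-rank preconditioners of this kind attractive at scale.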
Talks

Software
Papers

GeNIOS: an (almost) second-order operator-splitting solver for large-scale convex optimization
PROMISE: Preconditioned Stochastic Optimization Methods by Incorporating Scalable Curvature Estimates
On the (linear) convergence of Generalized Newton Inexact ADMM
SketchySGD: Reliable Stochastic Optimization via Robust Curvature Estimates
NysADMM: faster composite convex optimization via low-rank approximation
Randomized Nyström Preconditioning
A Strict Complementarity Approach to Error Bound and Sensitivity of Solution of Conic Programs
On the simplicity and conditioning of low rank semidefinite programs
Scalable Semidefinite Programming
Randomized Sketching Algorithms for Low-Memory Dynamic Optimization
An Optimal-Storage Approach to Semidefinite Programming using Approximate Complementarity
kFW: A Frank-Wolfe style algorithm with stronger subproblem oracles
Frank-Wolfe Style Algorithms for Large Scale Optimization
Sketchy Decisions: Convex Low-Rank Matrix Optimization with Optimal Storage
The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM