An Efficient Method for Large-Scale l1-Regularized Convex Loss Minimization

K. Koh, S.-J. Kim, and S. Boyd

Proceedings of the IEEE Information Theory and Applications Workshop, pages 223-230, January 2007.

Convex loss minimization with l1 regularization has been proposed as a promising method for feature selection in classification (e.g., l1-regularized logistic regression) and regression (e.g., l1-regularized least-squares). In this paper we describe an efficient interior-point method for solving large-scale l1-regularized convex loss minimization problems that uses a preconditioned conjugate gradient method to compute the search step. The method can solve very large problems; for example, it solves an l1-regularized logistic regression problem with a million features and examples (e.g., the 20 Newsgroups data set) in a few minutes on a PC.
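
For concreteness, the l1-regularized logistic regression problem mentioned above can be written as the following convex program (a standard formulation; the notation here, with feature vectors a_i, labels b_i in {-1, +1}, weight vector w, intercept v, and regularization parameter lambda > 0, is chosen for illustration):

\[
\operatorname*{minimize}_{w,\,v} \quad \frac{1}{m} \sum_{i=1}^{m} \log\!\left( 1 + \exp\!\left( -b_i \left( a_i^{T} w + v \right) \right) \right) \;+\; \lambda \, \| w \|_1 .
\]

The l1 penalty on w (the intercept v is typically left unregularized) drives many components of w to exactly zero, which is what makes the formulation useful for feature selection; the l1-regularized least-squares case replaces the average logistic loss with a squared residual of the form \(\| A w - y \|_2^2\).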