Fast and scalable Lasso via stochastic Frank–Wolfe methods with a convergence guarantee
Authors: Emanuele Frandi, Ricardo Ñanculef, Stefano Lodi, Claudio Sartori, Johan A. K. Suykens
Abstract
Frank–Wolfe (FW) algorithms have often been proposed over the last few years as efficient solvers for a variety of optimization problems arising in machine learning. The ability to work with cheap, projection-free iterations and the incremental nature of the method make FW a very effective choice for many large-scale problems where computing a sparse model is desirable. In this paper, we present a high-performance implementation of the FW method tailored to solving large-scale Lasso regression problems, based on a randomized iteration, and prove that the convergence guarantees of the standard FW method are preserved in the stochastic setting. We show experimentally that our algorithm outperforms several existing state-of-the-art methods, including the Coordinate Descent algorithm by Friedman et al. (one of the fastest known Lasso solvers), on several benchmark datasets with very large numbers of features, without sacrificing model accuracy. Our results show that the algorithm is able to generate the complete regularization path on problems with up to four million variables in under one minute.
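To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a stochastic Frank–Wolfe step for the constrained Lasso formulation min ||Ax − b||² s.t. ||x||₁ ≤ t. The function name, parameters, uniform coordinate sampling, and the standard 2/(k+2) step size are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def stochastic_fw_lasso(A, b, t, n_iters=500, sample_size=100, seed=0):
    """Sketch of a stochastic Frank-Wolfe solver for
    min ||Ax - b||^2  s.t.  ||x||_1 <= t  (illustrative, not the
    paper's exact algorithm)."""
    rng = np.random.default_rng(seed)
    n_features = A.shape[1]
    x = np.zeros(n_features)
    residual = -b.astype(float)          # A @ x - b, maintained incrementally
    for k in range(n_iters):
        # Randomized step: score only a random subset of coordinates
        # instead of scanning all n_features gradient entries.
        idx = rng.choice(n_features, size=min(sample_size, n_features),
                         replace=False)
        grad = A[:, idx].T @ residual    # partial gradient of 0.5*||Ax - b||^2
        jloc = np.argmax(np.abs(grad))
        j = idx[jloc]
        # Frank-Wolfe vertex of the l1 ball of radius t: s = -t*sign(g_j)*e_j.
        s_j = -t * np.sign(grad[jloc])
        gamma = 2.0 / (k + 2.0)          # standard FW step size
        # Projection-free convex-combination update; keeps ||x||_1 <= t.
        x *= 1.0 - gamma
        x[j] += gamma * s_j
        residual = (1.0 - gamma) * residual + gamma * (s_j * A[:, j] - b)
    return x
```

Each iteration touches a single column of A and a subset of the gradient, which is what makes the approach attractive when the number of features is in the millions.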
Keywords: Frank–Wolfe algorithm, Lasso, Large-scale regression
Paper URL: https://doi.org/10.1007/s10994-016-5578-4