An Efficient Support Vector Machine Learning Method with Second-Order Cone Programming for Large-Scale Problems
Authors: Rameswar Debnath, Masakazu Muramatsu, Haruhisa Takahashi
Abstract
In this paper we propose a new fast learning algorithm for the support vector machine (SVM). The proposed method is based on the technique of second-order cone programming: we reformulate the SVM's quadratic programming problem as a second-order cone programming problem. The method requires decomposing the kernel matrix of the SVM optimization problem, and the decomposed matrix is used in the new formulation. Since the kernel matrix is positive semidefinite, the dimension of the decomposed matrix can be reduced by decomposition (factorization) methods, and the performance of the proposed method depends on this dimension. Experimental results show that the proposed method is much faster than the quadratic programming solver LOQO when the dimension of the decomposed matrix is small enough compared to that of the kernel matrix. The proposed method is also faster than the method of Fine and Scheinberg (2001) for both low-rank and full-rank kernel matrices. Working set selection is an important issue in the SVM decomposition (chunking) method; we also modify Hsu and Lin's working set selection approach to handle large working sets. The proposed approach leads to faster convergence.
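The core step the abstract describes, factorizing the positive semidefinite kernel matrix into a factor of reduced dimension, can be illustrated with a truncated eigenvalue decomposition (one of the factorizations named in the keywords). The sketch below is illustrative only; the function name, tolerance, and toy kernel are assumptions, not taken from the paper:

```python
import numpy as np

def low_rank_factor(K, tol=1e-8):
    """Return F with K ~= F @ F.T, where F has as many columns as
    eigenvalues of K above tol (a sketch, not the paper's algorithm)."""
    # Eigendecompose the symmetric PSD kernel matrix: K = V diag(w) V^T.
    w, V = np.linalg.eigh(K)
    # Discard numerically zero eigenvalues; for a low-rank kernel matrix
    # the retained dimension r can be much smaller than n.
    keep = w > tol
    # F = V_r diag(sqrt(w_r)) has shape (n, r) and satisfies K ~= F F^T.
    return V[:, keep] * np.sqrt(w[keep])

# Toy example: an RBF kernel matrix on random 2-D points (illustrative data).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)

F = low_rank_factor(K)
# F.shape[1] is the dimension of the decomposed matrix; the abstract's
# speed-up applies when it is small relative to K's dimension (here 50).
```

A Cholesky factorization with symmetric pivoting, also listed in the keywords, serves the same purpose when an explicit eigendecomposition is too costly.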
Keywords: second-order cone programming, quadratic programming, Cholesky factorization, eigenvalue decomposition, support vector machine
Peer review process:
Paper URL: https://doi.org/10.1007/s10489-005-4609-9