A general soft method for learning SVM classifiers with L1-norm penalty

Authors:


Abstract

Based on the geometric interpretation of support vector machines (SVMs), this paper presents a general technique that allows almost all existing L2-norm-penalty-based geometric algorithms, including Gilbert's algorithm, the Schlesinger–Kozinec (SK) algorithm and the Mitchell–Dem'yanov–Malozemov (MDM) algorithm, to be softened so that they learn the corresponding L1-SVM classifiers. Intrinsically, the resulting soft algorithms find ε-optimal nearest points between two soft convex hulls. Theoretical analysis indicates that the proposed soft algorithms are essentially generalizations of the corresponding existing hard algorithms; consequently, they retain the same convergence properties and have almost identical computational cost. As a specific example, solving ν-SVMs with the proposed soft MDM algorithm is investigated, and the corresponding solution procedure is specified and analyzed. To validate the general soft technique, several real-world classification experiments are conducted with the proposed L1-norm-based MDM algorithms, and the numerical results demonstrate that their performance is competitive with that of the corresponding L2-norm-based algorithms, such as the SK and MDM algorithms.
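To make the geometric picture concrete, the sketch below (assuming NumPy, with Xp and Xn as (n, d) arrays) shows an MDM-style alternating update that searches for ε-optimal nearest points between two reduced ("soft") convex hulls. The function name soft_mdm_nearest_points, the cap parameter mu and all other identifiers are illustrative assumptions, and the loop is a simplified stand-in for the family of algorithms the abstract describes, not the paper's exact procedure.

```python
import numpy as np

def soft_mdm_nearest_points(Xp, Xn, mu=0.5, eps=1e-4, max_iter=10000):
    """MDM-style search for eps-optimal nearest points between the
    reduced ("soft") convex hulls of two point sets Xp and Xn.

    A soft hull point is sum_i c_i * x_i with sum(c) = 1 and 0 <= c_i <= mu;
    choosing mu < 1 shrinks the hulls, which is what "softens" the problem.
    Illustrative sketch only, not the paper's exact algorithm.
    """
    n_p, n_n = len(Xp), len(Xn)
    assert mu >= 1.0 / n_p and mu >= 1.0 / n_n, "mu too small to be feasible"
    alpha = np.full(n_p, 1.0 / n_p)  # weights defining u in the positive hull
    beta = np.full(n_n, 1.0 / n_n)   # weights defining v in the negative hull

    for _ in range(max_iter):
        w = alpha @ Xp - beta @ Xn   # current difference vector u - v
        gp, gn = Xp @ w, Xn @ w      # per-point gradients <w, x_i>
        # Donor/receiver pair on the positive hull: move weight from the
        # point with the largest gradient to the one with the smallest.
        i_don = np.argmax(np.where(alpha > 0, gp, -np.inf))
        i_rec = np.argmin(np.where(alpha < mu, gp, np.inf))
        gap_p = gp[i_don] - gp[i_rec]
        # Donor/receiver pair on the negative hull (gradient sign flips for v).
        j_don = np.argmin(np.where(beta > 0, gn, np.inf))
        j_rec = np.argmax(np.where(beta < mu, gn, -np.inf))
        gap_n = gn[j_rec] - gn[j_don]
        if max(gap_p, gap_n) <= eps:  # eps-optimality certificate
            break
        if gap_p >= gap_n:            # update the hull with the larger violation
            d = Xp[i_rec] - Xp[i_don]
            # Exact line-search step, clipped so 0 <= alpha_i <= mu is kept.
            t = min(gap_p / (d @ d), alpha[i_don], mu - alpha[i_rec])
            alpha[i_don] -= t
            alpha[i_rec] += t
        else:
            d = Xn[j_rec] - Xn[j_don]
            t = min(gap_n / (d @ d), beta[j_don], mu - beta[j_rec])
            beta[j_don] -= t
            beta[j_rec] += t
    return alpha, beta
```

With the returned weights, w = alpha @ Xp - beta @ Xn and the midpoint (u + v)/2 yield a candidate separating hyperplane; shrinking mu trades hull separation against the number of bounded coefficients, mirroring the role the abstract assigns to the L1-norm penalty in ν-SVMs.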

Keywords: Support vector machines, Classification, ν-SVMs, Nearest points, Gilbert's algorithm, Schlesinger–Kozinec's algorithm, Mitchell–Dem'yanov–Malozemov's algorithm, Soft convex hulls

Article history: Received 14 April 2006, Revised 25 May 2007, Accepted 16 August 2007, Available online 21 August 2007.

DOI: https://doi.org/10.1016/j.patcog.2007.08.004