Valley-loss regular simplex support vector machine for robust multiclass classification

Authors:

Highlights:

Abstract:

Handling noisy features and outlier labels is an important issue for the support vector machine (SVM). Although the pinball-loss SVM (Pin-SVM) and the ramp-loss SVM (Ramp-SVM) can deal with feature noise and outlier labels respectively, neither handles both, and extending them from binary to multiclass classification usually requires partitioning strategies. Since the regular simplex support vector machine (RSSVM) has been proposed as a novel all-in-one K-class classification model with clear advantages over partitioning strategies, developing a loss function that is simultaneously robust to feature noise and insensitive to outlier labels, and embedding it into the RSSVM framework, is a promising direction. In this paper, a valley-loss regular simplex support vector machine (V-RSSVM) for robust multiclass classification is presented. Inheriting the merits of both the pinball-type and ramp-type losses, the valley loss enjoys not only robustness to feature noise and outlier labels but also excellent sparseness. To train V-RSSVM quickly, a sequential minimal optimization (SMO)-type solver assisted by the concave–convex procedure (CCCP) and an initial-solution strategy for acceleration were developed. We also investigated the robustness, generalization error bound, and sparseness of V-RSSVM in theory. Numerical results on twenty-five real-life data sets verify the effectiveness of the proposed V-RSSVM model.
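For intuition, the sketch below shows one plausible valley-type loss: a pinball-style penalty truncated at a ceiling, so that the pinball slope gives feature-noise robustness while the ramp-style cap bounds the influence of outlier labels. This is an illustrative assumption only; the function name and the parameters `tau` and `s` are hypothetical, and the exact definition used by V-RSSVM is given in the paper.

```python
import numpy as np

def valley_loss(u, tau=0.5, s=2.0):
    """Hypothetical valley-type loss on a margin residual u (u > 0 means a
    violated margin). Not the paper's exact formula, just its two ingredients:
      - pinball part: slope 1 for u > 0, slope tau for u < 0 (feature-noise robustness);
      - ramp-style truncation at level s (bounded loss for outlier labels).
    """
    pinball = np.maximum(u, -tau * u)   # pinball (quantile) penalty
    return np.minimum(pinball, s)       # cap the loss so outliers cannot dominate
```

Note how the cap `s` is what makes the loss bounded (hence robust to outlier labels) but also non-convex, which is why a CCCP-style decomposition into convex and concave parts is a natural training approach.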

Keywords: Feature noise and outlier labels, Robust K-class classifier, Sparseness, Valley-loss function, Regular simplex support vector machine

Article history: Received 12 June 2020, Revised 13 January 2021, Accepted 16 January 2021, Available online 24 January 2021, Version of Record 4 February 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.106801