Asymmetric and robust loss function driven least squares support vector machine

Abstract:

The least squares support vector machine (LSSVM) considerably simplifies problem solving; however, it has two limitations. First, it treats samples on both sides of the proximal hyperplane equally, without differentiating them by location; second, it is sensitive to noise and outliers. To address these issues, this paper proposes an asymmetric, robust least squares support vector machine, termed QTLS, which combines the Quadratic Type Squared Error Loss Function (QTSELF) with LSSVM; this is the first time QTSELF has been introduced into machine learning. On the one hand, QTLS imposes different penalties on samples according to their locations, placing greater emphasis on samples that are susceptible to misclassification. On the other hand, it improves robustness by imposing only a tiny penalty on noise and outliers located far from the proximal hyperplane. We investigate the generalization capacity of QTLS using Rademacher complexity theory. The RMSProp algorithm is used to solve both the linear and nonlinear QTLS. Extensive experiments demonstrate the effectiveness of QTLS on binary classification problems.
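As a rough illustration of the mechanics the abstract describes, the sketch below trains a linear classifier with a hand-rolled RMSProp loop on a hypothetical asymmetric, bounded quadratic-type loss. The loss `loss_and_dmargin`, the weights `a` and `b`, and the helper `fit_linear_qtls` are illustrative assumptions, not the paper's QTSELF or QTLS formulation; they only mimic the two qualitative properties claimed above: location-dependent penalties and a vanishing contribution from points far beyond the proximal hyperplane.

```python
import numpy as np

# Minimal sketch, NOT the paper's method: a hypothetical asymmetric,
# bounded quadratic-type loss stands in for QTSELF, whose exact form is
# defined in the paper and not reproduced here.

def loss_and_dmargin(margin, a=2.0, b=1.0):
    """Illustrative loss and its derivative w.r.t. the margin y*(w.x + b)."""
    e = 1.0 - margin                          # deviation from the target margin
    w = np.where(margin < 1.0, a, b)          # asymmetric weights (hypothetical):
                                              # heavier penalty on the risky side
    loss = w * e**2 / (1.0 + e**2)            # bounded, so the loss saturates
    dmargin = -w * 2.0 * e / (1.0 + e**2)**2  # gradient -> 0 for distant outliers
    return loss, dmargin

def fit_linear_qtls(X, y, lr=0.01, rho=0.9, eps=1e-8, lam=1e-3, epochs=200):
    """Linear model trained with RMSProp, echoing the solver named in the abstract."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    cw, cb = np.zeros(d), 0.0                 # RMSProp running squared-gradient caches
    for _ in range(epochs):
        margin = y * (X @ w + b)
        _, dm = loss_and_dmargin(margin)
        gw = (dm * y) @ X / n + lam * w       # regularized gradient in w
        gb = np.mean(dm * y)
        cw = rho * cw + (1 - rho) * gw**2     # RMSProp second-moment updates
        cb = rho * cb + (1 - rho) * gb**2
        w -= lr * gw / (np.sqrt(cw) + eps)    # scaled gradient steps
        b -= lr * gb / (np.sqrt(cb) + eps)
    return w, b

# Toy usage: labels must be in {-1, +1}
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) + np.array([1.0, 1.0])
y = np.where(X[:, 0] + X[:, 1] > 2.0, 1.0, -1.0)
w, b = fit_linear_qtls(X, y)
print("train accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Because the sketched loss saturates, a far-away outlier's gradient contribution shrinks toward zero, which is the standard way a bounded loss yields the robustness the abstract attributes to QTLS.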

Keywords: Least squares support vector machine, Asymmetric, Robustness, Loss function, Gradient descent method

Article history: Received 8 July 2022; Revised 4 October 2022; Accepted 5 October 2022; Available online 10 October 2022; Version of Record 21 October 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109990