Robust distance metric optimization driven GEPSVM classifier for pattern classification

Authors:

Highlights:

Abstract

Proximal support vector machine via generalized eigenvalues (GEPSVM) is one of the most successful methods for classification problems. However, GEPSVM is vulnerable to outliers because it learns classifiers based on the squared L2-norm distance, with no specific strategy for handling outliers. Motivated by existing studies that improve the robustness of GEPSVM via L1-norm or not-squared L2-norm distance formulations, a novel GEPSVM formulation that minimizes the p-th power of the L2-norm distance is proposed, namely L2,p-GEPSVM. This formulation weakens the negative effects of both light and heavy outliers in the data. An iterative algorithm is designed to solve the general L2,p-norm distance minimization problem, and its convergence is rigorously proven. In addition, the parameters of L2,p-GEPSVM are adjusted to balance accuracy and training time, which is especially useful for larger datasets. Extensive results indicate that L2,p-GEPSVM improves classification performance and robustness in various experimental settings.
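For context, the sketch below illustrates the standard squared-L2-norm GEPSVM baseline that the abstract refers to: each class is assigned a proximal plane obtained from a generalized eigenvalue problem, and test points are labeled by the nearer plane. This is a minimal illustration only, not the paper's L2,p-norm algorithm; the function names (gepsvm_plane, gepsvm_predict) and the Tikhonov parameter delta are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import eig

def gepsvm_plane(A, B, delta=1e-4):
    """Fit one proximal plane w'x + b = 0 close to the rows of A and far from
    the rows of B by minimizing the Rayleigh quotient ||[A e]z||^2 / ||[B e]z||^2
    (classical squared-L2 GEPSVM; delta*I is a Tikhonov regularization term)."""
    G = np.hstack([A, np.ones((A.shape[0], 1))])
    H = np.hstack([B, np.ones((B.shape[0], 1))])
    P = G.T @ G + delta * np.eye(G.shape[1])
    Q = H.T @ H
    vals, vecs = eig(P, Q)                       # generalized eigenvalue problem
    z = np.real(vecs[:, np.argmin(np.real(vals))])
    return z[:-1], z[-1]                         # (w, b)

def gepsvm_predict(X, plane_pos, plane_neg):
    """Label each row of X by the class whose plane is nearer (normalized distance)."""
    w1, b1 = plane_pos
    w2, b2 = plane_neg
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

Usage would pair two calls, e.g. plane_pos = gepsvm_plane(A, B) and plane_neg = gepsvm_plane(B, A), where A and B hold the samples of the two classes. Because the objective squares the point-to-plane distances, outliers dominate the fit, which is the weakness the L2,p-GEPSVM formulation is designed to mitigate.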

Keywords: Classification problem, Distance metric learning, Outliers and noises, Robust L2,p-GEPSVM method, Squared L2-norm distance

Article history: Received 24 December 2020, Revised 23 February 2022, Accepted 6 May 2022, Available online 17 May 2022, Version of Record 17 May 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108779