Robust non-convex least squares loss function for regression with outliers
Authors:
Abstract
In this paper, we propose a robust scheme for least squares support vector regression (LS-SVR), termed RLS-SVR, which employs a non-convex least squares loss function to overcome the sensitivity of LS-SVR to outliers. The non-convex loss assigns a constant penalty to any large outlier. The proposed loss function can be expressed as a difference of convex functions (DC), so the resulting optimization problem is a DC program, which can be solved with the Concave-Convex Procedure (CCCP). RLS-SVR builds the regression function iteratively, solving a set of linear equations at each iteration. The proposed RLS-SVR includes the classical LS-SVR as a special case. Numerical experiments on both artificial and benchmark datasets confirm the promising performance of the proposed algorithm.
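To make the iterative scheme described in the abstract concrete, the sketch below shows one plausible CCCP loop for an LS-SVR with a truncated squared loss. It is an illustrative reconstruction only: the RBF kernel, the truncated loss min(e^2, t^2), and the parameter names (gamma, sigma, t) are assumptions introduced here for illustration, not the paper's exact formulation, and the actual RLS-SVR loss and update rule may differ.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix (kernel choice is an illustrative assumption)."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def robust_ls_svr(X, y, gamma=10.0, sigma=1.0, t=1.0, max_iter=50, tol=1e-6):
    """CCCP-style sketch of a robust LS-SVR with a truncated squared loss.

    Assumed loss: L(e) = min(e**2, t**2) = e**2 - max(e**2 - t**2, 0),
    i.e. a difference of convex functions.  Each CCCP step linearizes the
    concave part at the current residuals, which here reduces to solving
    the ordinary LS-SVR linear system with shifted targets y - delta,
    where delta_i equals the current residual when it exceeds the
    truncation level t and is zero otherwise.  With delta = 0 the update
    is exactly classical LS-SVR.
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                          # sum-to-zero constraint on alpha
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma       # standard LS-SVR system matrix
    delta = np.zeros(n)                     # linearization of the concave part
    alpha, b = np.zeros(n), 0.0
    for _ in range(max_iter):
        rhs = np.concatenate(([0.0], y - delta))
        sol = np.linalg.solve(A, rhs)       # one set of linear equations per iteration
        b_new, alpha_new = sol[0], sol[1:]
        resid = y - (K @ alpha_new + b_new)
        delta_new = np.where(np.abs(resid) > t, resid, 0.0)
        converged = (np.max(np.abs(alpha_new - alpha)) < tol
                     and np.max(np.abs(delta_new - delta)) < tol)
        alpha, b, delta = alpha_new, b_new, delta_new
        if converged:
            break
    return alpha, b

def predict(X_train, alpha, b, X_test, sigma=1.0):
    """Evaluate the learned regression function at new points."""
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b
```

In this sketch, samples whose residuals stay below the assumed truncation level t contribute exactly as in classical LS-SVR, while points flagged as outliers have their quadratic penalty capped, which is the qualitative behaviour the abstract attributes to the non-convex loss.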
Keywords: Least squares support vector regression, Robust, Iterative strategy, Loss function, DC program
Article history: Received 16 January 2014, Revised 5 August 2014, Accepted 5 August 2014, Available online 12 August 2014.
DOI: https://doi.org/10.1016/j.knosys.2014.08.003