Training Lagrangian twin support vector regression via unconstrained convex minimization

Authors:

Highlights:

Abstract

In this paper, a new unconstrained convex minimization problem is formulated as the Lagrangian dual of the 2-norm twin support vector regression (TSVR). The proposed formulation leads to two smaller-sized unconstrained minimization problems whose objective functions are piecewise quadratic and differentiable, and gradient-based iterative methods are proposed for solving them. However, since the objective functions contain the non-smooth 'plus' function, two approaches are taken: (i) considering their generalized Hessian, or replacing the 'plus' function with a smooth approximation, and applying the Newton–Armijo algorithm; (ii) obtaining their critical points by a functional iterative algorithm. Computational results obtained on a number of synthetic and real-world benchmark datasets clearly illustrate the superiority of the proposed unconstrained Lagrangian twin support vector regression formulation: comparable generalization performance is achieved with much faster learning speed in comparison with the classical support vector regression and TSVR.
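The smoothing approach mentioned above can be illustrated with a minimal sketch. The names and the particular smoothing function below are assumptions for illustration (the entropy-type approximation p(x, α) = x + (1/α)·log(1 + e^(−αx)), commonly used in smooth SVM formulations), not necessarily the exact choice made in the paper:

```python
import numpy as np

def plus(x):
    # Non-smooth 'plus' function: (x)_+ = max(x, 0)
    return np.maximum(x, 0.0)

def smooth_plus(x, alpha=5.0):
    # Smooth approximation p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)).
    # It is differentiable everywhere and converges to (x)_+ as alpha -> infinity,
    # with uniform error bounded by log(2) / alpha.
    return x + np.log1p(np.exp(-alpha * x)) / alpha

# Check the approximation quality on a grid (hypothetical demonstration values)
x = np.linspace(-2.0, 2.0, 201)
max_err = np.max(np.abs(smooth_plus(x, alpha=50.0) - plus(x)))
print(max_err)  # bounded by log(2)/50 ~ 0.0139
```

Because the smoothed objective is twice differentiable, a Newton–Armijo iteration can then be applied directly; the generalized-Hessian alternative instead keeps the original piecewise-quadratic objective and uses a subdifferential-based Hessian substitute.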

Keywords: Generalized Hessian, Gradient-based iterative methods, Smooth approximation, Support vector regression, Twin support vector regression, Unconstrained convex minimization

Article history: Received 3 June 2013, Revised 19 January 2014, Accepted 19 January 2014, Available online 27 January 2014.

DOI: https://doi.org/10.1016/j.knosys.2014.01.018