Support Vector Machine incorporated with feature discrimination
Abstract
Support Vector Machine (SVM) achieves state-of-the-art performance in many real applications. Its performance superiority stems from the maximization of the between-class margin, or loosely speaking, from making full use of the discriminative information carried by between-class samples. In this paper, we exploit not only such discriminative information from samples but also the discrimination of individual features, and develop the feature-discrimination-incorporated SVM (FDSVM). Instead of minimizing the l2-norm of the feature weight vector, or equivalently, imposing equal penalization on all weight components during SVM learning, FDSVM penalizes each weight by an amount that decreases with the corresponding feature's discrimination measure, so that features with better discrimination are given greater importance. Experiments on both toy and real UCI datasets demonstrate that FDSVM often achieves better performance with comparable efficiency.
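A minimal sketch of the weighted-penalty idea described above. The abstract does not specify the discrimination measure or the penalty mapping, so a per-feature Fisher score and the mapping d_j = 1/(1 + score_j) are used here purely as illustrative assumptions; fdsvm_fit and fdsvm_predict are hypothetical helper names. Minimizing (1/2) w^T D w with a diagonal penalty matrix D whose entries decrease with feature discrimination is equivalent to rescaling each feature by 1/sqrt(d_j) and training a standard linear SVM, which is the shortcut taken below.

import numpy as np
from sklearn.svm import LinearSVC

def fisher_scores(X, y):
    """Per-feature Fisher score for a binary problem (assumed discrimination measure)."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12
    return num / den

def fdsvm_fit(X, y, C=1.0):
    scores = fisher_scores(X, y)
    # Penalty decreases with discrimination; the exact mapping below is an
    # illustrative assumption, not the paper's formulation.
    d = 1.0 / (1.0 + scores)
    scale = 1.0 / np.sqrt(d)           # equivalent per-feature rescaling
    clf = LinearSVC(C=C).fit(X * scale, y)
    return clf, scale

def fdsvm_predict(clf, scale, X):
    return clf.predict(X * scale)

Under this formulation, a feature with a high discrimination score receives a small penalty d_j, hence a large rescaling factor, letting the learned weight on that feature grow more cheaply than under the uniform l2 penalty of a standard SVM.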
Keywords: Weight vector, Feature (attribute) discrimination, Weight penalization matrix, Support Vector Machine, Pattern classification
Article history: Available online 19 April 2011.
DOI: https://doi.org/10.1016/j.eswa.2011.04.034