Statistical Properties and Adaptive Tuning of Support Vector Machines

Authors: Yi Lin, Grace Wahba, Hao Zhang, Yoonkyung Lee

Abstract

In this paper we consider the statistical aspects of support vector machines (SVMs) in the classification context, and describe an approach to adaptively tuning the smoothing parameter(s) in the SVMs. The relation between the Bayes rule of classification and the SVMs is discussed, shedding light on why the SVMs work well. This relation also reveals that the misclassification rate of the SVMs is closely related to the generalized comparative Kullback-Leibler distance (GCKL) proposed in Wahba (1999; in Schölkopf, Burges, & Smola (Eds.), Advances in Kernel Methods—Support Vector Learning. Cambridge, MA: MIT Press). The adaptive tuning is based on the generalized approximate cross validation (GACV), an easily computable proxy for the GCKL. The results are generalized to the unbalanced case, where the fraction of members of each class in the training set differs from that in the general population and the costs of the two kinds of misclassification errors are different. The main results in this paper have been obtained in several places elsewhere; here we take the opportunity to organize them in one place and to note how they fit together and reinforce one another. The work reviewed is mostly that of the authors.
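For readers who want the key quantities spelled out, the following is a minimal sketch in LaTeX of the standard regularized form of the SVM and its population target; the notation (f, \lambda, p(x), H_K) is our gloss on the abstract, not copied from the paper itself.

% SVM as regularized hinge-loss minimization over an RKHS H_K,
% with smoothing parameter \lambda and labels y_i \in \{-1, +1\}:
\min_{f \in H_K}\ \frac{1}{n}\sum_{i=1}^{n}\bigl(1 - y_i f(x_i)\bigr)_{+} \;+\; \lambda\,\|f\|_{H_K}^{2}

% Population minimizer of the expected hinge loss,
% where p(x) = P(Y = 1 \mid X = x):
f^{*}(x) \;=\; \operatorname{sign}\bigl(p(x) - \tfrac{1}{2}\bigr)

% f^* is exactly the Bayes rule, which is the sense in which the SVM
% targets the optimal classifier. The GCKL is then (roughly) the expected
% hinge loss of the fitted f_\lambda over fresh labels at the design points,
% the unobservable quantity for which GACV serves as a computable proxy:
\mathrm{GCKL}(\lambda) \;=\; \mathbb{E}\,\frac{1}{n}\sum_{i=1}^{n}\bigl(1 - Y_i^{\mathrm{new}} f_{\lambda}(x_i)\bigr)_{+}

Under this reading, adaptive tuning amounts to choosing \lambda (and any kernel parameters) to minimize GACV as a computable stand-in for GCKL.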

Keywords: support vector machine, classification, Bayes rule, GCKL, GACV

Paper URL: https://doi.org/10.1023/A:1013951620650