Model Selection for Small Sample Regression

Authors: Olivier Chapelle, Vladimir Vapnik, Yoshua Bengio

Abstract

Model selection is an important ingredient of many machine learning algorithms, in particular when the sample size is small, in order to strike the right trade-off between overfitting and underfitting. Previous classical results for linear regression are based on an asymptotic analysis. We present a new penalization method for performing model selection for regression that is appropriate even for small samples. Our penalization is based on an accurate estimator of the ratio of the expected training error and the expected generalization error, in terms of the expected eigenvalues of the input covariance matrix.
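To make the penalization framework concrete, here is a minimal sketch (not the paper's estimator) of how ratio-based penalties are typically used in linear regression model selection; the symbols $R_{\mathrm{emp}}$, $T$, $d$, and $n$ are notation introduced here for illustration, and the two correction factors shown are the classical asymptotic ones (Akaike's FPE and GCV) that the abstract contrasts with the proposed small-sample estimator.

```latex
% Penalization-based model selection for linear regression (illustrative sketch).
% R_emp(d): training (empirical) error of the model with d parameters,
% n: sample size, p = d/n.
% The selected model size minimizes a penalized estimate of the
% generalization error:
%   d* = argmin_d  T(d/n) * R_emp(d)
% Classical asymptotic correction factors (not the paper's new estimator):
\[
  \hat{R}(d) \;=\; T\!\left(\tfrac{d}{n}\right) R_{\mathrm{emp}}(d),
  \qquad
  T_{\mathrm{FPE}}(p) \;=\; \frac{1+p}{1-p},
  \qquad
  T_{\mathrm{GCV}}(p) \;=\; \frac{1}{(1-p)^{2}}.
\]
```

The paper's contribution, as stated in the abstract, is to replace such asymptotic correction factors with an estimator of the training-to-generalization error ratio expressed through the expected eigenvalues of the input covariance matrix, which remains accurate when $n$ is small.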

Keywords: model selection, parametric regression, uniform convergence bounds

Paper URL: https://doi.org/10.1023/A:1013943418833