Generalization bounds for learning with linear, polygonal, quadratic and conic side knowledge

Authors: Theja Tulabandhula, Cynthia Rudin

Abstract

In this paper, we consider a supervised learning setting where side knowledge is provided about the labels of unlabeled examples. The side knowledge has the effect of reducing the hypothesis space, leading to tighter generalization bounds, and thus possibly better generalization. We consider several types of side knowledge, the first leading to linear and polygonal constraints on the hypothesis space, the second leading to quadratic constraints, and the last leading to conic constraints. We show how different types of domain knowledge can lead directly to these kinds of side knowledge. We prove bounds on complexity measures of the hypothesis space for quadratic and conic side knowledge, and show that these bounds are tight in a specific sense for the quadratic case.
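As an illustration of the setting described above, the sketch below fits a linear model whose hypothesis space is shrunk by a quadratic constraint (a bound on the Euclidean norm of the weight vector), solved by projected gradient descent. This is a minimal, hypothetical example of quadratic side knowledge restricting the hypothesis space, not the paper's actual algorithm; all function names and parameter values are illustrative.

```python
import numpy as np

def fit_constrained(X, y, R=1.0, lr=0.01, steps=2000):
    """Minimize ||Xw - y||^2 / n subject to ||w||_2 <= R.

    The constraint ||w||_2 <= R plays the role of quadratic side
    knowledge: it reduces the hypothesis space to a norm ball,
    which is what tightens the generalization bound.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / n   # gradient of the squared loss
        w = w - lr * grad
        norm = np.linalg.norm(w)
        if norm > R:                          # project back onto the ball
            w *= R / norm
    return w

# Toy data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(50)

w_hat = fit_constrained(X, y, R=1.0)
print(np.linalg.norm(w_hat))  # stays within the constraint radius R = 1.0
```

Linear or conic side knowledge would be handled analogously by swapping the projection step for a projection onto the corresponding polyhedron or cone.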

Keywords: Statistical learning theory, Generalization bounds, Rademacher complexity, Covering numbers, Constrained linear function classes, Side knowledge

Paper URL: https://doi.org/10.1007/s10994-014-5478-4