Lower Bound Methods and Separation Results for On-Line Learning Models

Authors: Wolfgang Maass, György Turán

Abstract

We consider the complexity of concept learning in various common models for on-line learning, focusing on methods for proving lower bounds on the learning complexity of a concept class. Among others, we consider the model for learning with equivalence and membership queries. For this model we give lower bounds on the number of queries that are needed to learn a concept class \(\mathcal{C}\) in terms of the Vapnik-Chervonenkis dimension of \(\mathcal{C}\), and in terms of the complexity of learning \(\mathcal{C}\) with arbitrary equivalence queries. Furthermore, we survey other known lower bound methods and exhibit all known relationships between the learning complexities in the models considered and some relevant combinatorial parameters. As it turns out, the picture is almost complete. This paper has been written so that it can be read without previous knowledge of Computational Learning Theory.
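The lower bounds described in the abstract are stated in terms of the Vapnik-Chervonenkis dimension: the size of the largest set of domain points that the concept class shatters (realizes every possible labeling of). For a finite class this parameter can be computed by brute force. The following sketch is purely illustrative; the function name and the example concept class are not from the paper:

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class.

    A set S is shattered by `concepts` if every subset of S arises
    as S & c for some concept c. The VC dimension is the size of
    the largest shattered set.
    """
    concept_sets = [frozenset(c) for c in concepts]
    best = 0
    for k in range(1, len(domain) + 1):
        shattered = False
        for S in combinations(domain, k):
            patterns = {frozenset(S) & c for c in concept_sets}
            if len(patterns) == 2 ** k:  # every subset of S is realized
                shattered = True
                break
        if not shattered:
            break  # no larger set can be shattered either
        best = k
    return best

# Example: singletons over a 4-point domain, plus the empty concept.
# Any single point is shattered, but no pair is, so the VC dimension is 1.
domain = [0, 1, 2, 3]
concepts = [set()] + [{x} for x in domain]
print(vc_dimension(domain, concepts))  # → 1
```

The early `break` is sound because a superset of an unshattered witness cannot help: if no set of size k is shattered, no set of size k+1 is either, since every subset of a shattered set is shattered.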

Keywords: Formal models for learning, learning algorithms, lower bound arguments, VC-dimension, machine learning

DOI: https://doi.org/10.1023/A:1022637031594