On the Sample Complexity for Nonoverlapping Neural Networks

Author: Michael Schmitt

Abstract

A neural network is said to be nonoverlapping if there is at most one edge outgoing from each node. We investigate the number of examples that a learning algorithm needs when using nonoverlapping neural networks as hypotheses. We derive bounds for this sample complexity in terms of the Vapnik-Chervonenkis dimension. In particular, we consider networks consisting of threshold, sigmoidal, and linear gates. We show that the class of nonoverlapping threshold networks and the class of nonoverlapping sigmoidal networks on n inputs both have Vapnik-Chervonenkis dimension Ω(n log n). This bound is asymptotically tight for the class of nonoverlapping threshold networks. We also present an upper bound for this class where the constants involved are considerably smaller than in a previous calculation. Finally, we argue that the Vapnik-Chervonenkis dimension of nonoverlapping threshold or sigmoidal networks cannot become larger by allowing the nodes to compute linear functions. This sheds some light on a recent result that exhibited neural networks with quadratic Vapnik-Chervonenkis dimension.
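For orientation only (this is the standard textbook PAC bound, not a result restated from the paper): a hypothesis class of Vapnik-Chervonenkis dimension d can be PAC-learned to error ε with confidence 1 − δ from on the order of

    m = O( (1/ε) ( d log(1/ε) + log(1/δ) ) )

examples. Under this standard bound, a VC dimension of Θ(n log n) for nonoverlapping threshold networks on n inputs yields a sample-complexity estimate that grows only slightly faster than linearly in n, which is how the VC-dimension bounds in the abstract translate into statements about the number of examples a learner needs.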

Keywords: neural networks, read-once formulas, threshold gates, sigmoidal gates, PAC learning, Vapnik-Chervonenkis dimension


Paper page: https://doi.org/10.1023/A:1007609822199