Polynomial Bounds for VC Dimension of Sigmoidal and General Pfaffian Neural Networks

Authors:

Highlights:

Abstract

We introduce a new method for proving explicit upper bounds on the VC dimension of general functional basis networks and prove, as an application and for the first time, that the VC dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^{-y}) is bounded by a quadratic polynomial O((lm)^2) in both the number l of programmable parameters and the number m of nodes. The proof method of this paper generalizes to a much wider class of Pfaffian activation functions and formulas, and also gives, for the first time, polynomial bounds on their VC dimension. We also present some other applications of our method.
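For reference, a minimal LaTeX restatement of the quantities named in the abstract; this is only a sketch of the stated claim, and the class symbol \mathcal{N}_{l,m} is hypothetical notation for the networks with l programmable parameters and m nodes, not notation taken from the paper:

% Sigmoidal activation and the VC-dimension bound stated in the abstract
% (l = number of programmable parameters, m = number of nodes).
\[
  \sigma(y) \;=\; \frac{1}{1 + e^{-y}},
  \qquad
  \mathrm{VCdim}\bigl(\mathcal{N}_{l,m}\bigr) \;=\; O\!\bigl((lm)^{2}\bigr).
\]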

Keywords:

Review history: Received 10 May 1995, Revised 25 October 1995, Available online 25 May 2002.

Paper link: https://doi.org/10.1006/jcss.1997.1477