Complexity Results on Learning by Neural Nets
Authors: Jyh-Han Lin, Jeffrey Scott Vitter
Abstract
We consider the computational complexity of learning by neural nets. We are interested in how hard it is to design appropriate neural net architectures and to train neural nets for general and specialized learning tasks. Our main result shows that the training problem for 2-cascade neural nets (which have only two non-input nodes, one of which is hidden) is NP-complete, which implies that finding an optimal net (in terms of the number of non-input units) that is consistent with a set of examples is also NP-complete. This result also demonstrates a surprising gap between the computational complexities of one-node (perceptron) and two-node neural net training problems, since the perceptron training problem can be solved in polynomial time by linear programming techniques. We conjecture that training a k-cascade neural net, which is a classical threshold network training problem, is also NP-complete, for each fixed k ≥ 2. We also show that the problem of finding an optimal perceptron (in terms of the number of non-zero weights) consistent with a set of training examples is NP-hard.
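The abstract contrasts the NP-complete two-node case with perceptron training, which is solvable in polynomial time via linear programming. The sketch below is not from the paper; it is a minimal illustration of that reduction, assuming SciPy's `linprog` as the LP solver and using an illustrative helper name and toy data. A consistent perceptron exists iff the strict constraints y_i(w·x_i + b) > 0 are feasible, which (up to scaling of w, b) is the LP feasibility problem y_i(w·x_i + b) ≥ 1.

```python
# Illustrative sketch: deciding whether a perceptron consistent with a sample
# exists, posed as LP feasibility (not code from the paper).
import numpy as np
from scipy.optimize import linprog

def perceptron_consistent(X, y):
    """Return (w, b) with y_i * (w @ x_i + b) >= 1 for all i, or None if no
    consistent perceptron exists.  Strict separation y_i * (w @ x_i + b) > 0
    is equivalent to this margin form up to rescaling of (w, b)."""
    n, d = X.shape
    c = np.zeros(d + 1)                                     # feasibility only: minimize 0
    A_ub = -y[:, None] * np.hstack([X, np.ones((n, 1))])    # rows: -y_i * (x_i, 1)
    b_ub = -np.ones(n)                                      # ... <= -1  <=>  y_i*(w@x_i + b) >= 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (d + 1),          # allow negative weights
                  method="highs")
    if not res.success:
        return None
    return res.x[:d], res.x[d]

# Toy usage: the AND function is linearly separable, so a perceptron exists.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)
print(perceptron_consistent(X, y))
```

Because the LP has d + 1 variables and n constraints, feasibility can be decided in time polynomial in the sample size, which is the gap the abstract highlights against the NP-complete 2-cascade training problem.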
Keywords: Neural nets, perceptrons, cascade neural nets, scaling, modular neural nets, learning from examples, concept learning, theoretical limitation
DOI: https://doi.org/10.1023/A:1022657626762