Simultaneous optimization of neural network function and architecture algorithm


Abstract:

A major limitation of current artificial neural network (NN) research is the inability to adequately identify unnecessary weights in the solution. A method that identified such weights would give decision-makers crucial information about the problem at hand and yield a network that is more effective and efficient. The Neural Network Simultaneous Optimization Algorithm (NNSOA) is proposed for supervised training of multilayer feedforward neural networks. We demonstrate with Monte Carlo studies that the NNSOA can obtain a global solution while simultaneously identifying a parsimonious network structure.
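The abstract does not spell out the algorithm, but the core idea it describes, evolving network weights with a genetic algorithm whose fitness rewards both low error and few surviving connections, can be sketched roughly as follows. Everything here (the 2-2-1 network, the pruning threshold, the per-connection penalty, and the operator choices) is an illustrative assumption, not the published NNSOA:

```python
import math
import random

random.seed(0)

# Toy dataset: XOR, a standard test for small feedforward networks.
DATA = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
        ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

N_W = 9  # 2-2-1 net: 2 hidden units * (2 weights + bias) + (2 weights + bias) out

def sigmoid(s):
    s = max(-60.0, min(60.0, s))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-s))

def forward(w, x):
    """Evaluate the 2-2-1 network encoded by weight vector w on input x."""
    h = [sigmoid(w[3 * j] * x[0] + w[3 * j + 1] * x[1] + w[3 * j + 2])
         for j in range(2)]
    return w[6] * h[0] + w[7] * h[1] + w[8]  # linear output unit

def prune(w, eps=0.1):
    """Zero out near-zero weights: candidate 'unnecessary' connections."""
    return [0.0 if abs(v) < eps else v for v in w]

def fitness(w):
    """SSE on the data plus a penalty per surviving connection (parsimony)."""
    w = prune(w)
    sse = sum((forward(w, x) - y) ** 2 for x, y in DATA)
    return sse + 0.01 * sum(1 for v in w if v != 0.0)

def evolve(pop_size=40, gens=200):
    pop = [[random.uniform(-2.0, 2.0) for _ in range(N_W)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]     # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_W)  # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(N_W)] += random.gauss(0.0, 0.5)  # mutation
            children.append(child)
        pop = survivors + children
    return prune(min(pop, key=fitness))

best = evolve()
```

The pruning threshold and the 0.01-per-connection penalty stand in for whatever mechanism the paper actually uses to flag unnecessary weights; the point is only that a single fitness function can drive weight optimization and architecture reduction simultaneously.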

Keywords: Artificial intelligence, Backpropagation, Genetic algorithm, Neural networks, Parsimonious

Article history: Received 1 January 2002, Revised 1 April 2002, Accepted 5 August 2002, Available online 28 October 2002.

DOI: https://doi.org/10.1016/S0167-9236(02)00147-1