Calibrating artificial neural networks by global optimization

Authors:

Highlights:

Abstract

Artificial neural networks (ANNs) are used extensively to model unknown or unspecified functional relationships between the input and output of a “black box” system. To apply the generic ANN concept to actual system model fitting problems, a key requirement is the training of the chosen (postulated) ANN structure. Such training serves to select the ANN parameters that minimize the discrepancy between the modeled system output and the training set of observations. We consider the parameterization of ANNs as a potentially multi-modal optimization problem and introduce a corresponding global optimization (GO) framework. The practical viability of the GO-based ANN training approach is illustrated by finding close numerical approximations of one-dimensional, yet visibly challenging, functions. For this purpose, we have implemented a flexible ANN framework and an easily expandable set of test functions in the technical computing system Mathematica. The MathOptimizer Professional global–local optimization software has been used to solve the induced (multi-dimensional) ANN calibration problems.
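To make the calibration problem concrete, the following is a minimal Wolfram Language sketch, not the authors' implementation: a one-hidden-layer ANN with Tanh units is fitted to a hypothetical one-dimensional test function by minimizing a sum-of-squares calibration error, with Mathematica's built-in NMinimize (differential evolution) standing in for the MathOptimizer Professional / LGO solver used in the paper. The names target, ann, and sse, the test function, and the network size are assumptions made for this example.

(* Minimal illustrative sketch, not the authors' code: one hidden layer of h Tanh units,
   calibrated to a hypothetical 1D test function by global optimization. *)
h = 5;                                    (* assumed number of hidden nodes *)
target[x_] := Sin[3 x] + 0.5 Sin[7 x];    (* stand-in for a "visibly challenging" 1D test function *)
ann[x_, w_List] := Module[{a, b, c},
  {a, b, c} = Partition[w, h];            (* 3 h parameters: input weights, biases, output weights *)
  c . Tanh[a x + b]];
xs = Range[-1., 1., 0.05];                (* training sample on [-1, 1] *)
vars = Array[Symbol["w" <> ToString[#]] &, 3 h];
sse = Total[(ann[#, vars] - target[#])^2 & /@ xs];   (* sum-of-squares calibration error *)
{err, rules} = NMinimize[sse, vars, Method -> "DifferentialEvolution"];
Plot[{target[x], ann[x, vars] /. rules}, {x, -1, 1}]  (* compare target and calibrated ANN *)

Because the calibration error is, in general, a multi-modal function of the weights, a global search method (here differential evolution) is used instead of a purely local optimizer; this mirrors the paper's motivation for casting ANN training as a GO problem.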

Keywords: Artificial neural networks, Calibration of ANNs by global optimization, ANN implementation in Mathematica, Lipschitz Global Optimizer (LGO) solver suite, MathOptimizer Professional (LGO linked to Mathematica), Numerical examples

Article history: Available online 12 July 2011.

DOI: https://doi.org/10.1016/j.eswa.2011.06.050