Fast Learning Algorithms for Feedforward Neural Networks

Authors: Minghu Jiang, Georges Gielen, Bo Zhang, Zhensheng Luo

Abstract

In order to improve the training speed of multilayer feedforward neural networks (MLFNN), we propose and explore two new fast backpropagation (BP) algorithms: (1) changing the error function, using the exponent attenuation (or bell impulse) function and the Fourier kernel function as alternatives; and (2) introducing a hybrid conjugate-gradient algorithm with global optimization of a dynamic learning rate, to overcome the conventional BP problems of getting stuck in local minima and slow convergence. Our experimental results demonstrate the effectiveness of the modified error functions: training is faster than with existing fast methods. In addition, on real speech data our hybrid algorithm achieves a higher recognition rate than the Polak-Ribière conjugate-gradient and conventional BP algorithms, and it requires less training time, is less complicated, and is more robust than the Fletcher-Reeves conjugate-gradient and conventional BP algorithms.
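To make the conjugate-gradient idea in the abstract concrete, here is a minimal sketch of Polak-Ribière conjugate-gradient training of a feedforward network. This is not the paper's hybrid algorithm: the tiny 2-2-1 sigmoid architecture, the XOR dataset, the squared-error function, and the Armijo backtracking line search are all our own illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: Polak-Ribière conjugate-gradient descent on a
# tiny 2-2-1 sigmoid network (XOR, squared error). Architecture, data, and
# line search are assumptions, not the paper's exact hybrid method.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    # Flat parameter vector -> (W1, b1, W2, b2) for the 2-2-1 network.
    return w[:4].reshape(2, 2), w[4:6], w[6:8].reshape(2, 1), w[8:9]

def loss_grad(w):
    # Forward pass, squared-error loss, and analytic backpropagated gradient.
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    e = o - y
    L = 0.5 * np.sum(e * e)
    do = e * o * (1 - o)                 # dL/d(output pre-activation)
    dh = (do @ W2.T) * h * (1 - h)       # backprop to hidden pre-activation
    g = np.concatenate([(X.T @ dh).ravel(), dh.sum(0),
                        (h.T @ do).ravel(), do.sum(0)])
    return L, g

def backtracking(w, d, L, g, a=1.0, rho=0.5, c=1e-4):
    # Armijo backtracking line search along direction d.
    while loss_grad(w + a * d)[0] > L + c * a * (g @ d) and a > 1e-10:
        a *= rho
    return a

w = rng.normal(scale=0.5, size=9)
L, g = loss_grad(w)
L0 = L                                   # initial loss, for comparison
d = -g                                   # first direction: steepest descent
for _ in range(500):
    a = backtracking(w, d, L, g)
    w = w + a * d
    L_new, g_new = loss_grad(w)
    # Polak-Ribière beta, clipped at 0 (automatic restart).
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta * d
    L, g = L_new, g_new
# After training, L holds the final squared error (well below L0).
```

The Polak-Ribière update differs from Fletcher-Reeves only in the formula for `beta`; clipping `beta` at zero restarts the search along the steepest-descent direction when conjugacy degrades, which is one common way these methods are made robust.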

Keywords: fast algorithm, error function, conjugate gradient, global convergence, feedforward neural networks

Paper URL: https://doi.org/10.1023/A:1020922701312