Backpropagation for Fully Connected Cascade Networks
Author: Yiping Cheng
Abstract
Fully connected cascade (FCC) networks are a recently proposed class of neural networks in which each layer has exactly one neuron and each neuron is connected to all neurons in the preceding layers. In this paper we derive and describe in detail an efficient backpropagation algorithm, named BPFCC, for computing the gradient for FCC networks. The backpropagation step in BPFCC is a carefully designed process for computing the derivative amplification coefficients, which are essential to the gradient computation; the average time complexity for computing one entry of the gradient is O(1). BPFCC must be invoked by a training algorithm to do useful work, and we have written a program, FCCNET, for that purpose. Currently, FCCNET uses the Levenberg–Marquardt algorithm to train FCC networks, and its loss function for classification is based on a nonlinear extension of logistic regression. For two-class classification we derive a Gauss–Newton-like approximation of the Hessian of the loss function; when there are more than two classes, a numerical approximation of the Hessian is used. Experimental results confirm the efficiency of BPFCC and the validity of the companion techniques.
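The cascade connectivity and the O(1)-per-entry gradient claim can be illustrated with a short sketch. The Python code below is not the paper's BPFCC implementation; the function names (fcc_forward, fcc_backward), the tanh activation, and the weight-vector layout are illustrative assumptions. The delta values play a role analogous to the derivative amplification coefficients mentioned in the abstract: once they are computed, each gradient entry costs a single multiplication.

```python
import numpy as np

def fcc_forward(x, weights, phi=np.tanh):
    """Forward pass through a fully connected cascade (FCC) network.

    x       : input vector of length n.
    weights : list of K weight vectors; weights[k] has length n + k + 1
              (the n inputs, the k previous neuron outputs, and a bias),
              since neuron k sees the inputs and all earlier neurons.
    Returns the list of neuron outputs; outputs[-1] is the network output.
    """
    signals = list(x)                  # the inputs feed every neuron
    outputs = []
    for w in weights:
        y = phi(np.dot(w[:-1], signals) + w[-1])   # weighted sum + bias
        outputs.append(y)
        signals.append(y)              # this neuron feeds all later ones
    return outputs

def fcc_backward(x, weights, outputs, phi_prime=lambda y: 1.0 - y ** 2):
    """Reverse pass: gradient of the network output w.r.t. every weight.

    delta[k] holds d(output)/d(net input of neuron k); in the cascade
    topology it accumulates one chain-rule term from each later neuron
    that neuron k feeds.  phi_prime is expressed in terms of the neuron
    output (here for tanh: phi'(z) = 1 - y**2).
    """
    n, K = len(x), len(weights)
    signals = list(x) + outputs        # everything that feeds some neuron
    delta = [0.0] * K
    delta[K - 1] = phi_prime(outputs[K - 1])
    for k in range(K - 2, -1, -1):
        # weights[j][n + k] is the weight from neuron k to neuron j
        s = sum(delta[j] * weights[j][n + k] for j in range(k + 1, K))
        delta[k] = phi_prime(outputs[k]) * s
    # each gradient entry is one product: delta[k] times the signal it scales
    return [delta[k] * np.array(signals[:n + k] + [1.0]) for k in range(K)]

# A tiny network: 2 inputs, 3 cascade neurons.
rng = np.random.default_rng(0)
weights = [rng.normal(size=2 + k + 1) for k in range(3)]
x = [0.5, -1.0]
outputs = fcc_forward(x, weights)
grads = fcc_backward(x, weights, outputs)
print(outputs[-1], [g.shape for g in grads])
```

In this sketch, computing all deltas costs O(K^2) for K neurons, while the network has O(K^2 + nK) weights, so the amortized cost per gradient entry is O(1), consistent with the complexity stated in the abstract.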
Keywords: Fully connected cascade networks, Backpropagation, Gradient computing, Neural network training
DOI: https://doi.org/10.1007/s11063-017-9588-4