Two Novel Versions of Randomized Feed Forward Artificial Neural Networks: Stochastic and Pruned Stochastic
Author: Ömer Faruk Ertuğrul
Abstract
Although artificial neural networks (ANNs) have achieved high accuracies, determining the optimal number of neurons in the hidden layer and the activation function is still an open issue. In this paper, the applicability of assigning the number of neurons in the hidden layer and the activation function randomly was investigated. Based on the findings, two novel versions of randomized ANNs, stochastic and pruned stochastic, were proposed to achieve higher accuracy without any time-consuming optimization stage. The proposed approaches were evaluated against and validated by the basic versions of the popular randomized ANNs [1], namely the random weight neural network [2], the random vector functional link [3], and the extreme learning machine [4]. In the stochastic version of randomized ANNs, not only the weights and biases of the neurons in the hidden layer but also the number of neurons in the hidden layer and the activation function of each neuron were assigned randomly. In the pruned stochastic version of these methods, the winner networks were pruned according to a novel strategy in order to produce a faster response. The proposed approaches were validated on 60 datasets (30 classification and 30 regression datasets). The obtained accuracies and time usages showed that both versions of randomized ANNs can be employed for classification and regression.
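The stochastic scheme the abstract describes can be illustrated with a minimal ELM-style sketch: input weights, biases, the hidden-layer size, and the activation function of each neuron are drawn at random, and only the output weights are solved in closed form by least squares. The activation pool and parameter ranges below are assumptions for illustration, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate activation pool; assigning one per neuron at random is the idea
# from the abstract, but this particular pool is an assumption.
ACTIVATIONS = [np.tanh, lambda x: 1.0 / (1.0 + np.exp(-x)), np.sin]


def hidden_output(X, W, b, acts):
    """Apply each neuron's randomly assigned activation to its projection."""
    return np.column_stack(
        [ACTIVATIONS[a](X @ W[:, j] + b[j]) for j, a in enumerate(acts)]
    )


def fit_stochastic_ann(X, y, max_neurons=100):
    """Train a randomized feed-forward net: random hidden-layer size,
    random weights/biases, random per-neuron activation; the output
    weights are the only trained parameters (linear least squares)."""
    n_features = X.shape[1]
    n_hidden = int(rng.integers(1, max_neurons + 1))     # random layer size
    W = rng.standard_normal((n_features, n_hidden))      # random input weights
    b = rng.standard_normal(n_hidden)                    # random biases
    acts = rng.integers(0, len(ACTIVATIONS), n_hidden)   # random activations
    H = hidden_output(X, W, b, acts)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)         # closed-form solve
    return W, b, acts, beta


def predict(model, X):
    W, b, acts, beta = model
    return hidden_output(X, W, b, acts) @ beta
```

The pruned stochastic variant would additionally discard neurons from the best ("winner") networks to shorten inference time; the pruning strategy itself is described in the paper, not here.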
Keywords: Randomized weight neural network, Random vector functional link, Extreme learning machines, Stochastic, Pruned stochastic, Random network structure, Random activation function
Paper URL: https://doi.org/10.1007/s11063-017-9752-x