Recurrent Multiplicative Neuron Model Artificial Neural Network for Non-linear Time Series Forecasting
Authors: Erol Egrioglu, Ufuk Yolcu, Cagdas Hakan Aladag, Eren Bas
Abstract
Artificial neural networks (ANNs) have been widely used in recent years to model non-linear time series, since the ANN approach is responsive and does not require assumptions such as normality or linearity. An important problem in using an ANN for time series forecasting is determining the number of neurons in the hidden layer, and several approaches have been proposed in the literature to deal with it. A new ANN model, called the multiplicative neuron model (MNM), has been suggested in the literature. Because MNM has only one neuron in its hidden layer, the problem of determining the number of hidden neurons is automatically solved when MNM is employed, and MNM can still produce accurate forecasts for non-linear time series. ANN models used for non-linear time series generally have autoregressive structures, since lagged values of the series are typically their inputs. On the other hand, it is well known that better forecasts for real-life time series can be obtained from models whose inputs also include lagged error terms. In this study, a new recurrent multiplicative neuron neural network model is proposed for the first time. In the proposed method, lagged error variables are included in the model, and the problem of determining the number of neurons in the hidden layer is avoided. The particle swarm optimization algorithm was used to train the proposed neural network. To evaluate its performance, the proposed model was applied to a real-life time series, and its results were compared with those obtained from other methods. The proposed method was observed to outperform the existing methods.
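To make the recurrent structure described above concrete, the following is a minimal sketch (not the authors' code) of a single multiplicative neuron whose inputs are lagged series values and lagged one-step forecast errors, using the standard multiplicative-neuron form f(∏(w_i·x_i + b_i)) with a logistic activation. The function and parameter names (`mnm_output`, `rolling_forecasts`, `n_lags`, `n_error_lags`) and the lag orders are illustrative assumptions; in the paper the weight and bias vector would be optimized by particle swarm optimization, which any off-the-shelf PSO could do by minimizing the `sse` objective below.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def mnm_output(inputs, weights, biases):
    """Multiplicative neuron: f(prod_i (w_i * x_i + b_i)) with logistic f."""
    net = np.prod(weights * inputs + biases)
    return sigmoid(net)


def rolling_forecasts(series, weights, biases, n_lags=2, n_error_lags=1):
    """One-step-ahead forecasts using lagged values and lagged errors as inputs."""
    errors = np.zeros(len(series))           # past forecast errors (assumed zero at start)
    forecasts = np.full(len(series), np.nan)
    for t in range(n_lags, len(series)):
        x = np.concatenate([series[t - n_lags:t],       # lagged observations
                            errors[t - n_error_lags:t]])  # lagged errors (recurrent part)
        forecasts[t] = mnm_output(x, weights, biases)
        errors[t] = series[t] - forecasts[t]
    return forecasts


def sse(params, series, n_lags=2, n_error_lags=1):
    """Sum of squared errors; this is the objective a PSO would minimize."""
    d = n_lags + n_error_lags
    w, b = params[:d], params[d:]
    f = rolling_forecasts(series, w, b, n_lags, n_error_lags)
    mask = ~np.isnan(f)
    return np.sum((series[mask] - f[mask]) ** 2)


# Toy usage: a series scaled into (0, 1), with random parameters standing in
# for PSO-trained ones purely to show the interface.
rng = np.random.default_rng(0)
y = (np.sin(np.linspace(0, 6, 60)) + 1.2) / 2.5
params = rng.uniform(-1, 1, size=2 * (2 + 1))   # weights and biases for 3 inputs
print("SSE with random parameters:", sse(params, y))
```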
Keywords: Artificial neural networks, Forecasting, Multiplicative neuron model, Non-linear time series, Recurrent neural networks
Paper URL: https://doi.org/10.1007/s11063-014-9342-0