Approximate logic neuron model trained by states of matter search algorithm
Abstract
An approximate logic neuron model (ALNM) is a single neural model with a dynamic dendritic structure. During training, the model can remove useless synapses and unnecessary dendritic branches through a neural pruning function, yielding a simplified dendritic morphology for each particular problem. The simplified ALNM can then be substituted with a logic circuit, which is easy to implement in hardware. However, the computational capacity of this model has been greatly restricted by its learning algorithm, the back-propagation (BP) algorithm, which is sensitive to initial values and prone to becoming trapped in local minima. To address this critical issue, we investigate the capabilities of heuristic optimization methods, which are acknowledged as global search algorithms. Through comparative experiments, the states of matter search (SMS) algorithm is verified to be the most suitable training method for the ALNM. To evaluate the performance of SMS, six benchmark datasets are used in the experiments, and the corresponding results are compared with those of the BP algorithm, other optimization methods, and several widely used classifiers. In addition, the classification performance of logic circuits trained by SMS is also presented in this study.
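The dendritic architecture described in the abstract can be sketched as follows. This is a minimal illustration based on the standard dendritic-neuron formulation (sigmoid synapses feeding multiplicative dendritic branches, a summing membrane, and a sigmoid soma); the function name and the parameter values `k`, `ks`, and `theta_s` are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def alnm_forward(x, w, theta, k=5.0, ks=5.0, theta_s=0.5):
    """Forward pass of a dendritic neuron in the style of the ALNM.

    x     : (n_inputs,) feature vector
    w     : (n_inputs, n_dendrites) synaptic weights
    theta : (n_inputs, n_dendrites) synaptic thresholds
    """
    # Synaptic layer: sigmoid connection of every input to every dendrite
    Y = 1.0 / (1.0 + np.exp(-k * (w * x[:, None] - theta)))
    # Dendrite layer: multiplicative interaction along each branch;
    # a synapse fixed near 1 is redundant, and a branch fixed near 0
    # contributes nothing -- these are the candidates pruning removes.
    Z = np.prod(Y, axis=0)
    # Membrane layer: sum of all dendritic branch outputs
    V = np.sum(Z)
    # Soma layer: final sigmoid producing a score in (0, 1)
    return 1.0 / (1.0 + np.exp(-ks * (V - theta_s)))
```

After training, synapses whose output saturates at a constant 1 and branches whose output saturates at 0 can be pruned, which is what allows the remaining structure to be mapped onto comparator and AND/OR logic gates.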
Keywords: Classification, States of matter search, Neural network, Pruning, Logic circuit
Article history: Received 14 January 2018, Revised 14 August 2018, Accepted 16 August 2018, Available online 20 August 2018, Version of Record 21 November 2018.
DOI: https://doi.org/10.1016/j.knosys.2018.08.020