A Winner-Take-All Neural Networks of N Linear Threshold Neurons without Self-Excitatory Connections

Authors: Hong Qu, Zhang Yi, Xiaobin Wang

Abstract

Multistable neural networks have attracted much interest in recent years, since monostable networks are computationally restricted. This paper studies a recurrent network of N linear threshold neurons without self-excitatory connections. Our analysis shows that this network exhibits Winner-Take-All (WTA) behavior, which has been recognized as a basic computational model in the brain. The contributions of this paper are: (1) it is proved mathematically that the proposed model is non-divergent; (2) an important implication of the proposed network model, Winner-Take-All, is studied; (3) digital computer simulations are carried out to validate the theoretical findings.
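
To make the WTA behavior concrete, below is a minimal simulation sketch of a recurrent linear threshold (LT) network with lateral inhibition and no self-excitatory connections. The assumed dynamics dx_i/dt = -x_i + h_i + Σ_j w_ij max(0, x_j), the zero-diagonal weight matrix with uniform inhibition strength beta, and the function name simulate_wta are illustrative assumptions, not the exact formulation or parameters used in the paper.

```python
import numpy as np

def simulate_wta(h, beta=1.2, dt=0.01, steps=5000):
    """Euler simulation of a recurrent network of N linear threshold neurons.

    Assumed dynamics: dx_i/dt = -x_i + h_i + sum_j w_ij * max(0, x_j),
    with w_ii = 0 (no self-excitatory connections) and w_ij = -beta for i != j
    (uniform lateral inhibition). With beta > 1, the neuron receiving the
    largest external input h_i tends to remain active while all others are
    suppressed, i.e. Winner-Take-All behavior.
    """
    h = np.asarray(h, dtype=float)
    n = h.size
    W = -beta * (np.ones((n, n)) - np.eye(n))  # zero diagonal, mutual inhibition
    x = np.zeros(n)                            # start from the resting state
    for _ in range(steps):
        x += dt * (-x + h + W @ np.maximum(x, 0.0))
    return x

if __name__ == "__main__":
    inputs = [0.8, 1.0, 0.6, 0.9]
    final = simulate_wta(inputs)
    print("final activations:", np.round(final, 3))
    print("winner index:", int(np.argmax(final)))  # the neuron with the largest input
```

In this sketch, only the winning neuron settles at a positive activation (approximately its own input), while the losers are driven below the threshold of the linear threshold nonlinearity, which is the qualitative WTA outcome the paper establishes for its model.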

Keywords: Winner-Take-All, Recurrent neural networks, Multistable, Linear threshold neurons

Paper URL: https://doi.org/10.1007/s11063-009-9100-x