Global exponential convergence and stability of gradient-based neural network for online matrix inversion

Authors:

Highlights:

Abstract

Wang proposed a gradient-based neural network (GNN) for computing matrix inverses online. Global asymptotic convergence was shown for this neural network when applied to inverting nonsingular matrices. Going beyond the previously established asymptotic convergence, this paper investigates more desirable properties of the gradient-based neural network: global exponential convergence in the nonsingular-matrix case, and global stability even in the singular-matrix case. Illustrative simulation results further substantiate the theoretical analysis of the gradient-based neural network for online matrix inversion.
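The GNN discussed above can be illustrated with a short numerical sketch. The dynamics below follow the standard Wang-type gradient neural network for matrix inversion, which descends the energy E(X) = ||AX − I||²_F / 2 via dX/dt = −γ·Aᵀ(AX − I); the gain `gamma`, step size `dt`, and iteration count are illustrative assumptions, and the continuous-time flow is approximated here by simple Euler integration:

```python
import numpy as np

def gnn_invert(A, gamma=10.0, dt=1e-3, steps=20000):
    """Euler-integrate the gradient-based neural network
    dX/dt = -gamma * A.T @ (A @ X - I),
    which follows the negative gradient of
    E(X) = ||A @ X - I||_F^2 / 2 toward X = inv(A)."""
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))  # arbitrary initial state; convergence is global
    for _ in range(steps):
        X -= dt * gamma * A.T @ (A @ X - I)
    return X

# For a nonsingular A, the state converges exponentially to inv(A).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = gnn_invert(A)
print(np.allclose(X, np.linalg.inv(A), atol=1e-4))  # -> True
```

The exponential rate is governed by γ and the smallest eigenvalue of AᵀA, consistent with the exponential-convergence result the paper establishes for nonsingular matrices; when A is singular, the flow remains stable but cannot reach an inverse.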

Keywords: Gradient-based neural network, Online matrix inversion, Lyapunov stability theory, Asymptotical convergence, Global exponential convergence

Article history: Available online 26 June 2009.

DOI: https://doi.org/10.1016/j.amc.2009.06.048