Rates of approximation by neural network interpolation operators

Authors:

Highlights:

• In the present paper, we introduce a general class of activation functions that are linear combinations of general sigmoidal functions. Using the newly introduced activation functions, we construct a single-layer neural network interpolation operator (Eq. (1.9)). The following properties of the operator are investigated: the rate of approximation of continuous functions by the operator; some important inequalities for the derivative of the operator; the converse result of approximation; a special combination of the operators that approximates the target function and its derivative simultaneously; and both the direct and converse theorems of approximation by a Kantorovich variant of the operator. We also give some examples of the newly defined activation functions. Finally, we present some numerical examples to demonstrate the validity of the obtained results.
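The paper's specific operator (Eq. (1.9)) is not reproduced in this record, so the following is only an illustrative sketch of the general construction the highlight describes: a bell-shaped kernel formed as a linear combination of shifted sigmoidal functions, used inside a single-layer network sum over equally spaced nodes. All function names and the choice of the logistic sigmoid are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def sigma(t):
    """Logistic sigmoidal function (an illustrative choice; the paper
    works with a general class of sigmoidal functions)."""
    return 1.0 / (1.0 + np.exp(-t))

def psi(t):
    """Bell-shaped kernel built as a linear combination of two shifted
    sigmoids. For the logistic sigma, the shifts telescope so that
    sum over all integers k of psi(t - k) equals 1 exactly."""
    return 0.5 * (sigma(t + 1.0) - sigma(t - 1.0))

def nn_operator(f, n, x):
    """Single-layer network sum over the nodes k/n, k = 0..n, on [0, 1]:
        sum_k f(k/n) * psi(n*x - k)  /  sum_k psi(n*x - k).
    The normalization by the finite kernel sum compensates for the
    truncated tails near the endpoints."""
    k = np.arange(n + 1)
    w = psi(n * np.atleast_1d(x)[:, None] - k)  # shape (len(x), n+1)
    return (w @ f(k / n)) / w.sum(axis=1)
```

On a smooth test function, the interior error of this particular sketch shrinks roughly like O(1/n^2) as the number of nodes grows; the quantitative rates proved in the paper concern its own interpolation operator and activation class, and will differ.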

Keywords: Sigmoidal function, Neural network operators, Interpolation, Uniform approximation, Simultaneous approximation

Article history: Received 19 May 2021, Revised 26 October 2021, Accepted 31 October 2021, Available online 6 December 2021, Version of Record 6 December 2021.

DOI: https://doi.org/10.1016/j.amc.2021.126781