Enhancement of neural networks with an alternative activation function tanhLU
Authors:
Highlights:
• A novel activation function (tanhLU) is proposed to enhance neural networks.
• tanhLU integrates the hyperbolic tangent function with a linear unit (a hedged sketch follows this list).
• The improvement from tanhLU is analyzed through weight gradient checks and experiments.
• tanhLU outperforms tanh and its variants across multiple neural network architectures and datasets.
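The highlights describe tanhLU as a combination of the hyperbolic tangent with a linear unit, whose benefit shows up in the weight gradients. The exact parameterization is defined in the full paper, not reproduced here; the following is only a minimal sketch, assuming a form like alpha * tanh(x) + beta * x with illustrative learnable scalars alpha and beta:

```python
import torch
import torch.nn as nn

class TanhLU(nn.Module):
    """Illustrative sketch of a tanh-plus-linear-unit activation.

    Assumption: tanhLU(x) = alpha * tanh(x) + beta * x, with alpha and
    beta treated as learnable scalars. The actual form used in the
    paper (see the DOI below) may differ; this is a reconstruction
    for illustration only.
    """

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Bounded tanh term plus an unbounded linear term: the linear
        # unit keeps the gradient from saturating for large |x|.
        return self.alpha * torch.tanh(x) + self.beta * x

# Gradient check on the assumed form: d/dx tanh(x) vanishes for large
# |x|, while the tanh-plus-linear form keeps a gradient of at least beta.
x = torch.linspace(-6.0, 6.0, 7, requires_grad=True)
TanhLU()(x).sum().backward()
print(x.grad)  # stays >= 1 everywhere under the assumed form
```

Under this assumed form, the derivative is alpha * (1 - tanh(x)^2) + beta, so unlike plain tanh it never decays to zero, which is consistent with the gradient-based improvement the highlights claim.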
Keywords: Neural networks, Activation function, tanhLUs
Article history: Received 6 July 2021; Revised 21 December 2021; Accepted 1 April 2022; Available online 8 April 2022; Version of Record 9 April 2022.
Article URL: https://doi.org/10.1016/j.eswa.2022.117181