A generic shift-norm-activation approach for deep learning
Authors:
Highlights:
• We present a shift-norm-activation framework that integrates normalization and activation into a single module by filtering out signals below an optimal threshold found at both the global and the local scale.
• A rigorous mathematical analysis covers the computational cost of normalization and activation, comparisons with existing methods, the framework's theoretical performance potential, and a reparameterization trick that improves optimization.
• Extensive experiments demonstrate the potential of the proposed framework on various computer vision benchmarks.
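The highlights above describe a module that first normalizes its input and then suppresses signals below a threshold, so that normalization and activation happen in one step. A minimal sketch of that idea is given below; the function name `shift_norm_act`, the fixed threshold `tau`, and the use of a single global mean/std are illustrative assumptions, not the paper's actual formulation (which learns the threshold and also operates at a local scale).

```python
import numpy as np

def shift_norm_act(x, tau=0.0, eps=1e-5):
    """Hypothetical sketch of a combined shift-norm-activation module.

    Normalizes the input at a global scale, then filters out signals
    below the threshold `tau` (fixed here for illustration; the paper
    finds an optimal threshold and also applies a local scale).
    """
    # Global-scale normalization over the whole tensor
    mu, sigma = x.mean(), x.std()
    x_norm = (x - mu) / (sigma + eps)
    # Thresholding step: pass only signals above tau,
    # acting as the activation within the same module
    return np.maximum(x_norm, tau)
```

With `tau = 0`, the thresholding step reduces to a ReLU applied to the normalized signal, which is one way to read "integrating normalization and activation into a single module".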
Keywords: Activation, Normalization, CNN, Shifting, Deep learning
Article history: Received 8 March 2020, Revised 19 July 2020, Accepted 19 August 2020, Available online 22 August 2020, Version of Record 24 August 2020.
DOI: https://doi.org/10.1016/j.patcog.2020.107609