Dimension-aware attention for efficient mobile networks

Authors:

Highlights:

• An attention mechanism, the dimension-aware attention (DAA) block, is developed for feature enhancement.

• A multi-branch factorization design enables low redundancy and high efficiency (a schematic sketch follows these highlights).

• The DAA block incurs only a small computational cost while providing large receptive fields.

• The DAA block can be easily integrated with existing mobile networks.

• Experiments on multiple vision tasks show the effectiveness of our method.
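To make the highlights concrete, below is a minimal PyTorch sketch of what a dimension-aware, multi-branch attention block could look like: separate lightweight branches attend along the channel, height, and width dimensions, and 1-D strip convolutions factorize a large receptive field at low cost. This is an illustration under stated assumptions, not the paper's exact DAA design; the class name `DAABlockSketch`, the kernel size, and the reduction ratio are all illustrative choices.

```python
import torch
import torch.nn as nn


class DAABlockSketch(nn.Module):
    """Hypothetical dimension-aware attention block (not the paper's exact design).

    Three lightweight branches pool along the channel, height, and width
    dimensions; the resulting attention maps are broadcast-multiplied back
    onto the input feature map, so the input shape is preserved.
    """

    def __init__(self, channels: int, reduction: int = 8, kernel_size: int = 7):
        super().__init__()
        hidden = max(channels // reduction, 4)
        # Channel branch: global average pool -> bottleneck MLP (SE-style).
        self.channel_fc = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1),
        )
        # Spatial branches: depthwise k x 1 and 1 x k strip convolutions
        # factorize a k x k receptive field at much lower cost.
        pad = kernel_size // 2
        self.h_conv = nn.Conv2d(channels, channels, (kernel_size, 1),
                                padding=(pad, 0), groups=channels)
        self.w_conv = nn.Conv2d(channels, channels, (1, kernel_size),
                                padding=(0, pad), groups=channels)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel attention from a globally pooled descriptor: (B, C, 1, 1).
        c_att = self.sigmoid(self.channel_fc(x.mean(dim=(2, 3), keepdim=True)))
        # Height attention from width-pooled strips: (B, C, H, 1).
        h_att = self.sigmoid(self.h_conv(x.mean(dim=3, keepdim=True)))
        # Width attention from height-pooled strips: (B, C, 1, W).
        w_att = self.sigmoid(self.w_conv(x.mean(dim=2, keepdim=True)))
        # Each dimension's attention rescales the input via broadcasting.
        return x * c_att * h_att * w_att


if __name__ == "__main__":
    block = DAABlockSketch(channels=32)
    x = torch.randn(2, 32, 56, 56)
    assert block(x).shape == x.shape  # shape-preserving, drop-in module
```

Because the block preserves the input shape, it could be dropped into an existing mobile backbone, for example after the depthwise convolution of a MobileNetV2 inverted residual block; where the paper actually places the DAA block is not specified in this page's metadata.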

Abstract


Keywords: Efficient mobile networks, Attention mechanism, Feature enhancement, Multi-branch factorization, Multi-dimensional information

Article history: Received 6 May 2022, Revised 5 July 2022, Accepted 13 July 2022, Available online 16 July 2022, Version of Record 21 July 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108899