A Simple and Light-Weight Attention Module for Convolutional Neural Networks

Authors: Jongchan Park, Sanghyun Woo, Joon-Young Lee, In So Kweon

Abstract

Many aspects of deep neural networks, such as depth, width, and cardinality, have been studied to strengthen representational power. In this work, we study the effect of attention in convolutional neural networks and present our idea in a simple, self-contained module called the Bottleneck Attention Module (BAM). Given an intermediate feature map, BAM efficiently produces an attention map along two factorized axes, channel and spatial, with negligible overhead. BAM is placed at the bottlenecks of various models, where the downsampling of feature maps occurs, and is jointly trained in an end-to-end manner. Ablation studies and extensive experiments are conducted on CIFAR-100/ImageNet classification, VOC2007/MS-COCO detection, super-resolution, and scene parsing with various architectures, including mobile-oriented networks. BAM shows consistent improvements across all experiments, demonstrating its wide applicability. The code and models are available at https://github.com/Jongchan/attention-module.
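Since this page carries only the abstract, the following is a minimal PyTorch sketch of a BAM-style block as the abstract describes it: channel and spatial attention computed along factorized axes, summed, and applied as a residual gate at a bottleneck. The reduction ratio `r`, dilation `d`, and exact layer layout are illustrative assumptions, not details taken from this page; consult the linked repository for the authors' implementation.

```python
# A BAM-style attention block: channel branch (global pooling + bottleneck
# MLP) and spatial branch (reduction + dilated convs), combined by broadcast
# addition and applied as a residual gate. Hyperparameters are assumptions.
import torch
import torch.nn as nn

class BAM(nn.Module):
    def __init__(self, channels: int, r: int = 16, d: int = 4):
        super().__init__()
        # Channel branch: global average pool -> bottleneck MLP,
        # producing per-channel attention logits of shape (B, C, 1, 1).
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // r, kernel_size=1),
            nn.BatchNorm2d(channels // r),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // r, channels, kernel_size=1),
        )
        # Spatial branch: 1x1 reduction -> dilated 3x3 convs -> 1x1 projection,
        # producing spatial attention logits of shape (B, 1, H, W).
        self.spatial = nn.Sequential(
            nn.Conv2d(channels, channels // r, kernel_size=1),
            nn.Conv2d(channels // r, channels // r, kernel_size=3,
                      padding=d, dilation=d),
            nn.BatchNorm2d(channels // r),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // r, channels // r, kernel_size=3,
                      padding=d, dilation=d),
            nn.BatchNorm2d(channels // r),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // r, 1, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast-add the two logit maps, squash to (0, 1), and apply
        # as a residual gate: F' = F + F * M(F).
        att = torch.sigmoid(self.channel(x) + self.spatial(x))
        return x + x * att
```

Broadcasting the (B, C, 1, 1) channel logits against the (B, 1, H, W) spatial logits yields a full (B, C, H, W) attention map while each branch stays small, which is consistent with the abstract's "negligible overhead" claim.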

Keywords: Attention mechanism, Deep learning, Convolutional Neural Networks, Image Recognition, Self-attention


DOI: https://doi.org/10.1007/s11263-019-01283-0