Hierarchical accumulation network with grid attention for image super-resolution
Abstract:
Deep convolutional neural networks (CNNs) have recently shown promising results in single image super-resolution (SISR) due to their powerful representation ability. However, existing CNN-based SR methods mainly focus on deeper architecture design to obtain high-level semantic information, neglecting the intermediate-layer features that contain fine-grained texture information and thus limiting their capacity to produce precise high-resolution images. To tackle this issue, we propose a hierarchical accumulation network (HAN) with grid attention in this paper. Specifically, a hierarchical feature accumulation (HFA) structure is proposed to accumulate the outputs of intermediate layers in a grouping manner, exploiting features of different semantic levels. Moreover, we introduce a multi-scale grid attention module (MGAM) to refine features of the same level. The MGAM employs pyramid sampling with a self-attention mechanism to efficiently model the non-local dependencies between pixel features and produce refined representations. In this way, universal features that capture both spatial similarity and multiple semantic levels are produced for image SR. Experimental results on five benchmark datasets with different degradation models demonstrate the superiority of our HAN in terms of quantitative metrics and visual quality.
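The abstract does not give implementation details, but a minimal PyTorch sketch of the pyramid-sampled self-attention idea behind the MGAM might look as follows. The module name PyramidAttention, the pool_sizes, and the channel-reduction factor are illustrative assumptions, not the authors' code; the sketch only shows how pooling keys/values to a few coarse grids keeps non-local attention affordable.

```python
# Hypothetical sketch of pyramid-sampled self-attention (not the authors' MGAM code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidAttention(nn.Module):
    def __init__(self, channels, pool_sizes=(1, 3, 6, 8)):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.pool_sizes = pool_sizes

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)        # (b, h*w, c//8)

        # Pyramid sampling: keys/values come from a few pooled grids, so the
        # attention matrix scales with the small pooled length n, not (h*w)^2.
        k_feats, v_feats = [], []
        for s in self.pool_sizes:
            pooled = F.adaptive_avg_pool2d(x, s)
            k_feats.append(self.key(pooled).flatten(2))      # (b, c//8, s*s)
            v_feats.append(self.value(pooled).flatten(2))    # (b, c, s*s)
        k = torch.cat(k_feats, dim=2)                        # (b, c//8, n)
        v = torch.cat(v_feats, dim=2).transpose(1, 2)        # (b, n, c)

        attn = torch.softmax(torch.bmm(q, k), dim=-1)        # (b, h*w, n)
        out = torch.bmm(attn, v).transpose(1, 2).reshape(b, c, h, w)
        return x + out                                        # residual refinement

# Usage: refine a feature map of shape (batch, channels, H, W).
feats = torch.randn(1, 64, 48, 48)
refined = PyramidAttention(64)(feats)
```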
Keywords: Image super-resolution, Grouping, Attention mechanism, Accumulation network
Article history: Received 24 May 2021, Revised 21 August 2021, Accepted 18 September 2021, Available online 30 September 2021, Version of Record 4 October 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107520