Paper Title

Multi-scale Attention Network for Single Image Super-Resolution

Paper Authors

Yan Wang, Yusen Li, Gang Wang, Xiaoguang Liu

Paper Abstract

ConvNets can compete with transformers in high-level tasks by exploiting larger receptive fields. To unleash the potential of ConvNets in super-resolution, we propose a multi-scale attention network (MAN), which couples the classical multi-scale mechanism with emerging large kernel attention. In particular, we propose multi-scale large kernel attention (MLKA) and a gated spatial attention unit (GSAU). In MLKA, we modify large kernel attention with multi-scale and gate schemes to obtain abundant attention maps at various granularity levels, thereby aggregating global and local information while avoiding potential blocking artifacts. In GSAU, we integrate the gate mechanism and spatial attention to remove an unnecessary linear layer and aggregate informative spatial context. To confirm the effectiveness of our designs, we evaluate MAN at multiple complexities by simply stacking different numbers of MLKA and GSAU modules. Experimental results show that our MAN can perform on par with SwinIR and achieve varied trade-offs between state-of-the-art performance and computation.
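The abstract describes the two building blocks only at a high level, so the following PyTorch sketch illustrates one plausible reading of them. The module names MLKA and GSAU come from the paper; the `lka_branch` helper, the three-way channel split, and the specific kernel/dilation triples are illustrative assumptions, not the paper's exact configuration, and normalization/residual wiring around the blocks is omitted.

```python
import torch
import torch.nn as nn

def lka_branch(ch: int, k_dw: int, k_dwd: int, dilation: int) -> nn.Sequential:
    """One large-kernel-attention branch: a small depth-wise conv, a
    depth-wise dilated conv, and a 1x1 conv jointly approximate one very
    large depth-wise kernel (the standard LKA decomposition)."""
    return nn.Sequential(
        nn.Conv2d(ch, ch, k_dw, padding=k_dw // 2, groups=ch),
        nn.Conv2d(ch, ch, k_dwd, padding=(k_dwd - 1) * dilation // 2,
                  dilation=dilation, groups=ch),
        nn.Conv2d(ch, ch, 1),
    )

class MLKA(nn.Module):
    """Multi-scale large kernel attention (sketch): channel groups are
    attended by LKA branches with different effective kernel sizes, and a
    depth-wise gate conv re-weights each attention map, countering the
    blocking artifacts the dilated convs can introduce."""
    def __init__(self, channels: int):
        super().__init__()
        assert channels % 3 == 0, "channels are split across 3 scales"
        ch = channels // 3
        # (k_dw, k_dwd, dilation) triples are assumed, not the paper's exact values
        self.attn = nn.ModuleList([
            lka_branch(ch, 3, 3, 2),   # small effective kernel
            lka_branch(ch, 3, 5, 2),   # medium
            lka_branch(ch, 5, 7, 3),   # large (~21x21 receptive field)
        ])
        self.gate = nn.ModuleList([
            nn.Conv2d(ch, ch, k, padding=k // 2, groups=ch) for k in (3, 5, 7)
        ])
        self.proj_in = nn.Conv2d(channels, channels, 1)
        self.proj_out = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        parts = self.proj_in(x).chunk(3, dim=1)
        # per-scale attention map multiplied by its gated value path
        out = [a(p) * g(p) for p, a, g in zip(parts, self.attn, self.gate)]
        return self.proj_out(torch.cat(out, dim=1))

class GSAU(nn.Module):
    """Gated spatial attention unit (sketch): one half of the expanded
    features is turned into a spatial attention map by a depth-wise conv
    and gates the other half, replacing the second linear layer of a
    plain feed-forward block."""
    def __init__(self, channels: int):
        super().__init__()
        self.proj_in = nn.Conv2d(channels, channels * 2, 1)   # value + gate halves
        self.spatial = nn.Conv2d(channels, channels, 7, padding=3, groups=channels)
        self.proj_out = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        v, g = self.proj_in(x).chunk(2, dim=1)
        return x + self.proj_out(v * self.spatial(g))         # gated spatial attention
```

Under this reading, a MAN building block would stack MLKA and GSAU (each behind a normalization layer and a residual connection), and the abstract's "multiple complexities" would correspond to simply stacking different numbers of such blocks.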
