Paper Title

From Coarse to Fine: Hierarchical Pixel Integration for Lightweight Image Super-Resolution

Paper Authors

Jie Liu, Chao Chen, Jie Tang, Gangshan Wu

Paper Abstract

Image super-resolution (SR) serves as a fundamental tool for the processing and transmission of multimedia data. Recently, Transformer-based models have achieved competitive performance in image SR. They divide images into fixed-size patches and apply self-attention on these patches to model long-range dependencies among pixels. However, this architecture design originated from high-level vision tasks and lacks design guidelines informed by SR knowledge. In this paper, we aim to design a new attention block whose insights come from interpreting Local Attribution Maps (LAM) of SR networks. Specifically, LAM presents a hierarchical importance map in which the most important pixels are located in a fine area of a patch and some less important pixels are spread over a coarse area of the whole image. To access pixels in the coarse area, instead of using a very large patch size, we propose a lightweight Global Pixel Access (GPA) module that applies cross-attention between a patch and the most similar patch in the image. In the fine area, we use an Intra-Patch Self-Attention (IPSA) module to model long-range pixel dependencies within a local patch, and then a $3\times3$ convolution is applied to process the finest details. In addition, a Cascaded Patch Division (CPD) strategy is proposed to enhance the perceptual quality of recovered images. Extensive experiments suggest that our method outperforms state-of-the-art lightweight SR methods by a large margin. Code is available at https://github.com/passerer/HPINet.
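
To make the coarse-to-fine attention flow described in the abstract more concrete, below is a minimal PyTorch sketch of the two attention steps it mentions: a GPA-style cross-attention in which each patch attends to its most similar patch in the image (coarse area), and an IPSA-style self-attention within each patch followed by a $3\times3$ convolution (fine area). The module names, tensor shapes, mean-pooled patch descriptors, residual connections, and the toy example at the bottom are illustrative assumptions rather than the authors' implementation, and the CPD strategy is not sketched; see the official repository for the actual code.

```python
import torch
import torch.nn as nn


class GlobalPixelAccess(nn.Module):
    """Coarse step (GPA-style sketch): each patch cross-attends to its most similar patch."""

    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.kv = nn.Linear(dim, dim * 2)
        self.proj = nn.Linear(dim, dim)

    def forward(self, patches):
        # patches: (B, N, P, C) = batch, number of patches, pixels per patch, channels
        B, N, P, C = patches.shape
        # Mean-pooled patch descriptors are an assumed way to measure patch similarity.
        desc = patches.mean(dim=2)                                   # (B, N, C)
        sim = desc @ desc.transpose(1, 2)                            # (B, N, N)
        eye = torch.eye(N, dtype=torch.bool, device=patches.device)
        sim = sim.masked_fill(eye, float('-inf'))                    # exclude self-matches
        idx = sim.argmax(dim=-1)                                     # (B, N) most similar patch
        partner = torch.gather(
            patches, 1, idx[:, :, None, None].expand(-1, -1, P, C))  # (B, N, P, C)
        # Cross-attention: queries from each patch, keys/values from its partner patch.
        q = self.q(patches)
        k, v = self.kv(partner).chunk(2, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * C ** -0.5                 # (B, N, P, P)
        out = attn.softmax(dim=-1) @ v
        return patches + self.proj(out)                              # residual link (assumed)


class IntraPatchSelfAttention(nn.Module):
    """Fine step (IPSA-style sketch): self-attention inside each patch, then a 3x3 conv."""

    def __init__(self, dim):
        super().__init__()
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)
        self.conv = nn.Conv2d(dim, dim, kernel_size=3, padding=1)

    def forward(self, patches, patch_size):
        # patches: (B, N, P, C) with P == patch_size * patch_size
        B, N, P, C = patches.shape
        q, k, v = self.qkv(patches).chunk(3, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * C ** -0.5                 # (B, N, P, P)
        x = patches + self.proj(attn.softmax(dim=-1) @ v)
        # Fold each patch back to 2D and refine the finest details with a 3x3 convolution.
        x = x.reshape(B * N, patch_size, patch_size, C).permute(0, 3, 1, 2)
        x = x + self.conv(x)
        return x.permute(0, 2, 3, 1).reshape(B, N, P, C)


if __name__ == "__main__":
    # Toy run: 16 patches of 8x8 pixels from a 64-channel feature map.
    feats = torch.randn(1, 16, 64, 64)            # (B, N, P=8*8, C=64)
    feats = GlobalPixelAccess(64)(feats)
    feats = IntraPatchSelfAttention(64)(feats, patch_size=8)
    print(feats.shape)                            # torch.Size([1, 16, 64, 64])
```

In this sketch the coarse and fine steps are stacked sequentially, which mirrors the abstract's ordering (global access first, then intra-patch refinement); how the actual HPINet blocks are composed and how patches are divided (including the CPD strategy) is defined in the official implementation.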
