Paper Title


Self-Supervised Fast Adaptation for Denoising via Meta-Learning

Authors

Seunghwan Lee, Donghyeon Cho, Jiwon Kim, Tae Hyun Kim

Abstract


Under certain statistical assumptions on the noise, recent self-supervised approaches for denoising have been introduced to learn network parameters without true clean images, and these methods can restore an image by exploiting information available from the given input (i.e., internal statistics) at test time. However, self-supervised methods have not yet been combined with conventional supervised denoising methods, which train denoising networks with a large number of external training samples. Thus, we propose a new denoising approach that can greatly outperform state-of-the-art supervised denoising methods by adapting their network parameters to the given input through self-supervision, without changing the network architectures. Moreover, we propose a meta-learning algorithm to enable quick adaptation of parameters to the specific input at test time. We demonstrate that the proposed method can be easily employed with state-of-the-art denoising networks without additional parameters, and achieves state-of-the-art performance on numerous benchmark datasets.
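The abstract's core idea, meta-learning an initialization that can be adapted to each test input in only a few gradient steps, can be illustrated with a toy first-order (Reptile-style) sketch. Everything below is an illustrative assumption rather than the paper's actual method: a one-parameter linear "denoiser", synthetic tasks in place of noisy images, and arbitrary hyperparameters.

```python
import random

def grad(w, x, y):
    # gradient of the squared error (w*x - y)**2 w.r.t. the single weight w
    return 2.0 * x * (w * x - y)

def task_loss(w, task):
    # mean squared error of the one-parameter "denoiser" f(x) = w*x on a task
    return sum((w * x - y) ** 2 for x, y in task) / len(task)

def adapt(w, task, inner_lr=0.02, steps=5):
    # inner loop: a few SGD steps, standing in for fast test-time adaptation
    for _ in range(steps):
        for x, y in task:
            w -= inner_lr * grad(w, x, y)
    return w

def make_task(rng):
    # synthetic task: recover y = w_true*x for a task-specific w_true
    # (a stand-in for "one noisy input" in the paper's setting)
    w_true = rng.uniform(0.7, 0.9)
    return [(x, w_true * x) for x in (1.0, 2.0, 3.0)]

def meta_train(iters=200, meta_lr=0.5, seed=0):
    # outer loop (Reptile-style first-order meta-update): move the shared
    # initialization toward each task's adapted weights
    rng = random.Random(seed)
    w0 = 0.0
    for _ in range(iters):
        w_adapted = adapt(w0, make_task(rng))
        w0 += meta_lr * (w_adapted - w0)
    return w0

if __name__ == "__main__":
    w0 = meta_train()
    new_task = [(x, 0.85 * x) for x in (1.0, 2.0, 3.0)]
    # a single adaptation step from the meta-learned init reaches a lower
    # loss than the same step taken from a naive init
    print(task_loss(adapt(w0, new_task, steps=1), new_task))
    print(task_loss(adapt(0.0, new_task, steps=1), new_task))
```

The paper instead adapts a full denoising network using a self-supervised loss computed on the noisy input itself (no clean target is available at test time), but the two-loop structure — fast inner adaptation per input, slow outer update of the shared initialization — is the same.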
