Paper Title

Denoising Masked AutoEncoders Help Robust Classification

Paper Authors

Quanlin Wu, Hang Ye, Yuntian Gu, Huishuai Zhang, Liwei Wang, Di He

Paper Abstract

In this paper, we propose a new self-supervised method, called Denoising Masked AutoEncoders (DMAE), for learning certified robust image classifiers. In DMAE, we corrupt each image by adding Gaussian noise to every pixel value and randomly masking several patches. A Transformer-based encoder-decoder model is then trained to reconstruct the original image from the corrupted one. In this learning paradigm, the encoder learns to capture semantics relevant to downstream tasks while also being robust to additive Gaussian noise. We show that the pre-trained encoder can naturally serve as the base classifier in Gaussian smoothed models, where the certified radius can be computed analytically for any data point. Although the proposed method is simple, it yields significant performance improvements in downstream classification tasks. We show that the DMAE ViT-Base model, which uses only 1/10 of the parameters of the model developed in the recent work arXiv:2206.10550, achieves competitive or better certified accuracy in various settings. The DMAE ViT-Large model significantly surpasses all previous results, establishing a new state of the art on the ImageNet dataset. We further demonstrate that the pre-trained model transfers well to the CIFAR-10 dataset, suggesting its wide adaptability. Models and code are available at https://github.com/quanlin-wu/dmae.
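The abstract names two concrete steps, so a minimal sketch may help make them explicit: the image corruption used for pre-training (per-pixel Gaussian noise plus random patch masking) and the certified-radius formula from Gaussian (randomized) smoothing. This is an illustration under assumed conventions, not the authors' implementation; the function names `corrupt` and `certified_radius` and all default hyperparameters here are placeholders, not values taken from the paper.

```python
# Minimal sketch of the two steps the abstract describes, assuming a
# PyTorch setting. Function names, the patch size, the mask ratio, and
# sigma are illustrative assumptions, not values from the paper.
import torch
from scipy.stats import norm


def corrupt(images, patch_size=16, mask_ratio=0.75, sigma=0.25):
    """Corrupt a batch: add Gaussian noise to every pixel, then pick a
    random subset of patches to mask out.

    images: (B, C, H, W) with H and W divisible by patch_size.
    Returns the noisy images and a (B, num_patches) boolean mask
    (True = patch is hidden from the encoder).
    """
    noisy = images + sigma * torch.randn_like(images)
    b, _, h, w = images.shape
    num_patches = (h // patch_size) * (w // patch_size)
    num_masked = int(mask_ratio * num_patches)
    # Rank random scores per image; the num_masked lowest-ranked
    # patches are masked (the usual MAE-style random-masking idiom).
    ranks = torch.rand(b, num_patches).argsort(dim=1).argsort(dim=1)
    mask = ranks < num_masked
    return noisy, mask


def certified_radius(p_lower, sigma=0.25):
    """One-sided certified radius of a Gaussian smoothed classifier
    (Cohen et al., 2019): R = sigma * Phi^{-1}(p_A), valid when
    p_lower > 0.5 lower-bounds the top-class probability."""
    return sigma * norm.ppf(p_lower)
```

At certification time, the fine-tuned encoder plays the role of the base classifier f inside the standard smoothed classifier g(x) = argmax_c P(f(x + ε) = c) with ε ~ N(0, σ²I), which is what makes the radius above analytically computable for any input.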
