Paper Title
A Survey on Physical Adversarial Attack in Computer Vision
Paper Authors
Paper Abstract
Over the past decade, deep learning has revolutionized tasks that once relied on hand-crafted feature extraction, and its strong feature-learning capability has led to substantial performance gains. However, deep neural networks (DNNs) have been shown to be vulnerable to adversarial examples: inputs crafted with malicious, tiny perturbations that are imperceptible to human observers yet cause DNNs to output wrong results. Existing adversarial attacks can be categorized into digital and physical attacks. The former pursue strong attack performance in laboratory settings but remain barely effective when applied to the physical world. In contrast, the latter focus on developing physically deployable attacks and thus exhibit greater robustness under complex physical environmental conditions. As DNN-based systems are increasingly deployed in the real world, strengthening the robustness of these systems has become urgent, and an exhaustive exploration of physical adversarial attacks is a precondition for doing so. To this end, this paper reviews the evolution of physical adversarial attacks against DNN-based computer vision tasks, aiming to provide useful information for developing stronger physical adversarial attacks. Specifically, we first propose a taxonomy to categorize and group current physical adversarial attacks. We then discuss existing physical attacks, focusing on techniques for improving their robustness under complex physical environmental conditions. Finally, we discuss the open issues of current physical adversarial attacks and suggest promising directions.
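To make the "tiny perturbation" idea concrete, here is a minimal NumPy sketch in the spirit of the fast gradient sign method (FGSM), a canonical digital attack of the kind the abstract contrasts with physical attacks. The linear "model", the input values, and the step size `epsilon` are illustrative assumptions, not details from the surveyed paper.

```python
import numpy as np

def fgsm_perturb(x, grad, epsilon):
    """One FGSM-style step: shift x by epsilon in the direction of
    the sign of the given gradient (an L-infinity-bounded change)."""
    return x + epsilon * np.sign(grad)

# Toy linear classifier: score = w . x, predicted label = sign(score).
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, 0.1, 0.4])   # clean input; score = 0.3 > 0

# For a linear model the gradient of the score w.r.t. x is w itself;
# to flip a positive prediction we step against that gradient.
x_adv = fgsm_perturb(x, -w, epsilon=0.2)

print(np.dot(w, x))      # clean score:       0.3  (positive class)
print(np.dot(w, x_adv))  # adversarial score: -0.4 (flipped sign)
```

Each coordinate moves by at most 0.2, yet the prediction flips; digital attacks exploit exactly this sensitivity, while physical attacks must additionally survive printing, viewpoint, and lighting changes.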