Paper Title

Adversarial Catoptric Light: An Effective, Stealthy and Robust Physical-World Attack to DNNs

Paper Authors

Chengyin Hu, Weiwen Shi

Paper Abstract

Deep neural networks (DNNs) have demonstrated exceptional success across various tasks, underscoring the need to evaluate the robustness of advanced DNNs. However, traditional methods that use stickers as physical perturbations to deceive classifiers struggle to achieve stealthiness and suffer from printing loss. Recent physical attacks have used light beams such as lasers and projectors, but the optical patterns they generate are artificial rather than natural. In this study, we introduce a novel physical attack, adversarial catoptric light (AdvCL), in which adversarial perturbations are generated from a common natural phenomenon, catoptric light, to achieve stealthy and naturalistic adversarial attacks against advanced DNNs in a black-box setting. We evaluate the proposed method along three axes: effectiveness, stealthiness, and robustness. Quantitative results in simulated environments demonstrate the method's effectiveness, and in physical scenarios we achieve an attack success rate of 83.5%, surpassing the baseline. Using common catoptric light as the perturbation enhances stealthiness and makes physical samples appear more natural. Robustness is validated by successfully attacking advanced and robust DNNs with a success rate above 80% in all cases. Additionally, we discuss defense strategies against AdvCL and outline further light-based physical attacks.
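The abstract does not specify AdvCL's optimizer, so the sketch below is only an illustration of the black-box query loop it describes: a simulated reflected-light spot, parameterized by hypothetical `center`, `radius`, `color`, and `alpha` values, is optimized by plain random search against a score function that stands in for a target classifier's confidence. All names and the choice of random search are assumptions, not the paper's actual method.

```python
import numpy as np

def apply_catoptric_light(image, center, radius, color, alpha):
    """Blend a circular light spot (simulated reflected light) onto an image.

    image: float array (H, W, 3) with values in [0, 1].
    """
    h, w, _ = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    out = image.copy()
    # Alpha-blend the light color over the masked pixels.
    out[mask] = (1 - alpha) * out[mask] + alpha * np.asarray(color)
    return np.clip(out, 0.0, 1.0)

def black_box_attack(image, confidence_fn, queries=200, rng=None):
    """Random search over light-spot parameters to minimize the
    true-class confidence returned by a black-box `confidence_fn`."""
    rng = np.random.default_rng(rng)
    h, w, _ = image.shape
    best_params, best_score = None, confidence_fn(image)
    for _ in range(queries):
        params = dict(
            center=(int(rng.integers(0, h)), int(rng.integers(0, w))),
            radius=int(rng.integers(5, min(h, w) // 2)),
            color=rng.uniform(0.5, 1.0, size=3),   # bright, light-like tint
            alpha=float(rng.uniform(0.3, 0.9)),    # blending strength
        )
        score = confidence_fn(apply_catoptric_light(image, **params))
        if score < best_score:
            best_score, best_params = score, params
    return best_params, best_score

if __name__ == "__main__":
    # Toy stand-in for a classifier's confidence: drops as the image
    # brightens. A real attack would query the target DNN instead.
    img = np.full((32, 32, 3), 0.2)
    toy_confidence = lambda x: 1.0 - float(x.mean())
    params, score = black_box_attack(img, toy_confidence, queries=50, rng=0)
    print(f"confidence: {toy_confidence(img):.3f} -> {score:.3f}")
```

In the real physical attack, the optimized parameters would be realized with an actual reflecting surface rather than pixel blending, and the confidence would come from querying the target classifier on photographs of the scene.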
