
Adversarial Sticker: A Stealthy Attack Method in the Physical World


Abstract

To assess the vulnerability of deep learning in the physical world, recent works introduce adversarial patches and apply them to different tasks. In this paper, we propose another kind of adversarial patch: the Meaningful Adversarial Sticker, a physically feasible and stealthy attack method that uses real stickers found in everyday life. Unlike previous adversarial patches that design perturbations, our method manipulates the sticker’s pasting position and rotation angle on the object to perform physical attacks. Because position and rotation angle are less affected by printing loss and color distortion, adversarial stickers maintain good attack performance in the physical world. In addition, to make adversarial stickers more practical in real scenes, we conduct attacks in the black-box setting with limited information, rather than the white-box setting with full knowledge of the threat model. To solve for the sticker’s parameters effectively, we design the Region based Heuristic Differential Algorithm, which exploits the newly found regional aggregation of effective solutions and an adaptive adjustment strategy for the evaluation criteria. Our method is comprehensively verified on Face Recognition and then extended to Image Retrieval and Traffic Sign Recognition. Extensive experiments show the proposed method is effective and efficient under complex physical conditions and generalizes well across tasks.
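The core idea — searching for a sticker's pasting position and rotation angle by repeatedly querying a black-box model — can be illustrated with a generic differential-evolution-style search. The sketch below is not the authors' Region based Heuristic Differential Algorithm; it is a minimal stand-in in which `black_box_score` is a hypothetical toy objective replacing real queries to a recognition model, and the parameter bounds are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Search space for the sticker parameters: paste position (x, y) in pixels
# and rotation angle in degrees. Bounds here are hypothetical.
BOUNDS = np.array([[0.0, 112.0],    # x
                   [0.0, 112.0],    # y
                   [-45.0, 45.0]])  # rotation angle

def black_box_score(params):
    """Stand-in for querying the threat model (e.g. a face recognizer's
    confidence on the true identity, which the attack wants to minimize).
    A toy quadratic with its optimum at x=60, y=30, angle=10, so the
    search has a known target."""
    x, y, ang = params
    return (x - 60.0) ** 2 + (y - 30.0) ** 2 + (ang - 10.0) ** 2

def clip(p):
    """Keep a candidate inside the valid parameter ranges."""
    return np.clip(p, BOUNDS[:, 0], BOUNDS[:, 1])

def differential_evolution(score, pop_size=20, iters=100, F=0.5, CR=0.9):
    """Classic DE/rand/1/bin search over the sticker parameters."""
    dim = len(BOUNDS)
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, dim))
    fit = np.array([score(p) for p in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # Pick three distinct population members other than i.
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, size=3, replace=False)]
            mutant = clip(a + F * (b - c))
            # Binomial crossover, guaranteeing at least one mutant gene.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f = score(trial)
            if f < fit[i]:  # greedy selection: keep the better candidate
                pop[i], fit[i] = trial, f
    best = np.argmin(fit)
    return pop[best], fit[best]

best_params, best_score = differential_evolution(black_box_score)
print("best (x, y, angle):", best_params, "score:", best_score)
```

In a real attack, `black_box_score` would render the sticker at `(x, y, angle)` onto the target image and query the model's output score, keeping the query budget limited as the paper's black-box setting requires.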
