DAP: A Dynamic Adversarial Patch for Evading Person Detectors

by Amira Guesmi, et al.

In this paper, we present a novel approach for generating naturalistic adversarial patches without using GANs. Our approach generates a Dynamic Adversarial Patch (DAP) that looks naturalistic while maintaining high attack efficiency and robustness in real-world scenarios. To achieve this, we redefine the optimization problem by introducing a new objective function in which a similarity metric is used to construct a similarity loss; this loss guides the patch to follow predefined patterns while maximizing the victim model's loss. Because our technique directly modifies the pixel values of the patch, it offers greater flexibility and a larger space for incorporating multiple transformations than GAN-based techniques. Furthermore, most clothing-based physical attacks assume static objects and ignore the transformations caused by non-rigid deformation due to changes in a person's pose. To address this limitation, we incorporate a "Creases Transformation" (CT) block, a preprocessing block that, together with an Expectation Over Transformation (EOT) block, generates a large variety of transformed patches during training, increasing robustness to real-world distortions (e.g., creases in clothing, rotation, re-scaling, random noise, and brightness and contrast variations). We demonstrate that the presence of such real-world variations in clothing and object pose leads to a drop in the performance of state-of-the-art attacks: these techniques achieve success rates of only 20% in the physical world and 30.8% in the digital world, whereas our attack achieves success rates of up to 65% and 84.56%, respectively, when attacking a YOLOv3tiny detector deployed in smart cameras at the edge.
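The two core ideas of the abstract, a combined objective that trades off detector evasion against similarity to a predefined pattern, and EOT-style augmentation of the patch during training, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the similarity metric (here mean squared distance), the `detector_score` stand-in for the full detection loss, the weight `lam`, and the `random_transform` helper are all assumptions for the sake of the example, and the creases transformation itself is omitted.

```python
import numpy as np

def similarity_loss(patch, pattern):
    # Mean squared distance between the patch and the predefined target
    # pattern (a stand-in for the paper's similarity metric).
    return float(np.mean((patch - pattern) ** 2))

def combined_objective(patch, pattern, detector_score, lam=0.5):
    # detector_score: the victim model's detection confidence on the patched
    # image (hypothetical scalar stand-in for the full detection loss).
    # Minimizing this objective lowers detection confidence while keeping
    # the patch close to the predefined naturalistic pattern.
    return detector_score + lam * similarity_loss(patch, pattern)

def random_transform(patch, rng):
    # EOT-style augmentation: random contrast, brightness, and additive
    # noise, applied directly in pixel space. Rotation, re-scaling, and
    # the creases transformation (CT) block are omitted from this sketch.
    contrast = rng.uniform(0.8, 1.2)
    brightness = rng.uniform(-0.1, 0.1)
    noise = rng.normal(0.0, 0.02, patch.shape)
    return np.clip(contrast * patch + brightness + noise, 0.0, 1.0)

# Toy usage: a patch identical to the pattern incurs no similarity penalty,
# so the objective reduces to the detector's confidence alone.
patch = np.full((3, 16, 16), 0.5)
pattern = np.full((3, 16, 16), 0.5)
obj = combined_objective(patch, pattern, detector_score=0.9)  # → 0.9
augmented = random_transform(patch, np.random.default_rng(0))
```

In the pixel-space formulation described above, the gradient of this objective with respect to the patch pixels can be taken directly, which is what gives the method more room to fold in transformations than GAN-based generators that optimize a latent code.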




