Robustness Metrics for Real-World Adversarial Examples

11/24/2019
by Brett Jefferson, et al.

We explore metrics for evaluating the robustness of real-world adversarial attacks, in particular adversarial patches, to changes in environmental conditions. We demonstrate how these metrics can be used to establish baseline model performance and uncover model biases, which can then be compared against real-world adversarial attacks. We define a custom score for an adversarial condition, adjusted for different environmental conditions, and examine how the score changes with respect to specific environmental factors. Lastly, we propose two use cases for confidence distributions in each environmental condition.
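The paper's exact scoring formula is not given in this abstract, but the idea of a score "adjusted for different environmental conditions" can be sketched as follows. This is a minimal illustrative example, not the authors' method: it assumes the adjustment normalizes the confidence drop caused by a patch against the clean baseline confidence measured under the same condition, so that conditions where the model is already weak do not inflate the attack's apparent effect. The function name, data, and condition labels are all hypothetical.

```python
import numpy as np

def condition_adjusted_score(baseline_conf, attacked_conf):
    """Hypothetical score: mean drop in true-class confidence under
    a patch attack, normalized by the clean baseline for that same
    environmental condition. Higher means a more effective attack."""
    baseline = np.mean(baseline_conf)
    attacked = np.mean(attacked_conf)
    return (baseline - attacked) / baseline

# Toy per-condition confidences (illustrative numbers only):
# each entry is (clean confidences, confidences with patch applied).
data = {
    "bright": ([0.95, 0.92, 0.97], [0.40, 0.35, 0.50]),
    "dim":    ([0.80, 0.75, 0.78], [0.55, 0.60, 0.52]),
}

scores = {
    cond: condition_adjusted_score(clean, patched)
    for cond, (clean, patched) in data.items()
}
```

Because the score is normalized per condition, a raw confidence drop of similar size counts for less under a condition where the clean baseline was already low, which matches the abstract's goal of separating model bias from attack effect.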
