Accelerating Adversarial Perturbation by 50% Propagation

11/09/2022
by Zhiqi Bu, et al.

Adversarial perturbation plays a significant role in the field of adversarial robustness: it is obtained by solving a maximization problem over the input data. We show that the backward propagation of this optimization can be accelerated 2× (and thus the overall optimization, including the forward propagation, by 1.5×) without any utility drop, by computing only the output gradient and not the parameter gradients during the backward propagation.
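The saving comes from the structure of backpropagation: each layer's backward pass does two pieces of work, propagating the gradient to its input and computing the gradient of its own parameters. An attack never updates the parameters, so the second piece can be skipped. A minimal NumPy sketch of this idea, using an assumed toy one-layer softmax model and an FGSM-style ascent step (not the authors' code), might look like:

```python
import numpy as np

# Minimal sketch (toy setup, not the paper's implementation): generate an
# adversarial perturbation with an FGSM step on a one-layer linear model,
# computing ONLY the input gradient dL/dx in the backward pass. The
# parameter gradient dL/dW = outer(x, p - y) -- roughly the other half of
# backprop's work -- is never formed, since the attack never updates W.

rng = np.random.default_rng(0)
W = rng.standard_normal((5, 3))      # fixed model weights (never updated)
x = rng.standard_normal(5)           # input to perturb
y = np.zeros(3); y[0] = 1.0          # one-hot target

def loss_and_input_grad(x):
    logits = x @ W                                   # forward pass
    p = np.exp(logits - logits.max()); p /= p.sum()  # softmax
    loss = -np.log(p[y.argmax()])                    # cross-entropy
    # Backward pass: propagate dL/dlogits = (p - y) to the input only.
    dx = W @ (p - y)
    return loss, dx

eps = 0.1
loss0, dx = loss_and_input_grad(x)
x_adv = x + eps * np.sign(dx)        # FGSM step: ascend the loss
loss1, _ = loss_and_input_grad(x_adv)
print(loss1 > loss0)  # prints: True -- the perturbation increases the loss
```

In an autodiff framework the same effect is typically achieved by freezing the parameters (e.g. disabling their gradient tracking) while keeping the input's gradient enabled, so the framework skips the parameter-gradient computations automatically.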
