Optimization without Backpropagation

09/13/2022
by Gabriel Belouze, et al.

Forward gradients were recently introduced to bypass backpropagation in automatic differentiation while retaining an unbiased estimator of the true gradient. We derive an optimality condition for obtaining the best-approximating forward gradients, which leads to mathematical insights suggesting that optimization in high dimension is challenging with forward gradients. Our extensive experiments on test functions support this claim.
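For readers unfamiliar with the construction, the standard forward gradient is g(θ) = (∇f(θ)·v) v with a random tangent v ~ N(0, I); since E[vvᵀ] = I, we have E[g] = ∇f(θ), so the estimator is unbiased while requiring only a single forward-mode pass. Below is a minimal JAX sketch (not code from the paper; the test function and step size are illustrative assumptions):

```python
import jax
import jax.numpy as jnp

def forward_gradient(f, theta, key):
    """Unbiased forward-gradient estimate g = (grad f . v) v with v ~ N(0, I)."""
    v = jax.random.normal(key, theta.shape)   # random tangent direction
    _, dirderiv = jax.jvp(f, (theta,), (v,))  # one forward-mode pass: <grad f, v>
    return dirderiv * v                       # E[g] = grad f(theta)

# Hypothetical usage: descend a classic test function without backpropagation.
rosenbrock = lambda x: jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                               + (1.0 - x[:-1]) ** 2)
theta = jnp.zeros(10)
key = jax.random.PRNGKey(0)
for _ in range(1000):
    key, subkey = jax.random.split(key)
    theta = theta - 1e-4 * forward_gradient(rosenbrock, theta, subkey)
```

Note that while each estimate is unbiased, its variance grows with the dimension of θ, which is the difficulty in high-dimensional optimization that the paper investigates.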
