G̅_mst: An Unbiased Stratified Statistic and a Fast Gradient Optimization Algorithm Based on It

10/07/2021
by   Aixiang Chen, et al.

The fluctuation of the gradient's expectation and variance caused by parameter updates between consecutive iterations is neglected or conflated by current mainstream gradient optimization algorithms. The work in this paper remedies this issue by introducing a novel unbiased stratified statistic, G̅_mst, and establishes a sufficient condition for fast convergence under G̅_mst. A novel algorithm named MSSG, designed on the basis of G̅_mst, outperforms other SGD-like algorithms. Theoretical conclusions and experimental evidence strongly suggest employing MSSG when training deep models.
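The abstract does not spell out how G̅_mst is constructed, so the sketch below is only a generic illustration of the underlying principle it invokes: a stratified sample mean, with stratum means weighted by stratum size, is an unbiased estimator of the full-data mean gradient. All names here (`stratified_estimate`, the contiguous-chunk strata) are illustrative assumptions, not the paper's actual statistic or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-example "gradients": one scalar gradient per training example.
grads = rng.normal(loc=2.0, scale=5.0, size=10_000)
true_mean = grads.mean()

# Partition the examples into M strata (contiguous chunks, purely illustrative;
# the paper's stratification scheme may differ).
M = 10
strata = np.array_split(grads, M)
weights = np.array([len(s) for s in strata]) / len(grads)

def stratified_estimate(n_per_stratum=5):
    """Stratified estimate of the mean gradient: draw a small sample from
    each stratum and weight the stratum means by stratum size. This is
    unbiased for the full-data mean regardless of the per-stratum sample size."""
    stratum_means = np.array(
        [rng.choice(s, size=n_per_stratum).mean() for s in strata]
    )
    return float(weights @ stratum_means)

# Averaging many independent stratified estimates converges to the true
# mean gradient, which is what unbiasedness guarantees.
avg_estimate = np.mean([stratified_estimate() for _ in range(2000)])
```

In an SGD-like loop, such a statistic would replace the plain minibatch mean as the gradient estimate; the claimed benefit of G̅_mst is controlling the expectation/variance fluctuation across iterations, which this toy example does not model.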
