Mini-Batch Stochastic ADMMs for Nonconvex Nonsmooth Optimization

02/08/2018
by   Feihu Huang, et al.

In this paper, we study mini-batch stochastic ADMMs (alternating direction methods of multipliers) for nonconvex nonsmooth optimization. We prove that, given an appropriate mini-batch size, the mini-batch stochastic ADMM without a variance reduction (VR) technique is convergent and achieves a convergence rate of O(1/T) for finding a stationary point of the nonconvex optimization problem, where T denotes the number of iterations. Moreover, we extend the mini-batch stochastic gradient method to the nonconvex SVRG-ADMM and SAGA-ADMM proposed in our initial paper (huang2016stochastic), and prove that these mini-batch stochastic ADMMs also achieve the O(1/T) convergence rate, without any condition on the mini-batch size. In particular, we give specific parameter choices for the step size η of the stochastic gradients and the penalty parameter ρ of the augmented Lagrangian function. Finally, experimental results demonstrate the effectiveness of our algorithms.
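To make the algorithm family concrete, below is a minimal NumPy sketch of one mini-batch stochastic ADMM loop (without variance reduction) for a graph-guided-lasso-style problem min_x (1/n)·Σ_i f_i(x) + λ‖Ax‖_1, rewritten with a splitting variable y as min_{x,y} f(x) + λ‖y‖_1 subject to Ax − y = 0. The least-squares losses, the linearized x-update, and all parameter values are illustrative assumptions, not the paper's exact formulation or tuning.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def minibatch_stochastic_admm(X, b, A, lam=0.1, rho=1.0, eta=0.01,
                              batch_size=64, T=1000, seed=0):
    # Illustrative setup: f_i(x) = 0.5 * (<X[i], x> - b[i])^2, regularizer lam*||A x||_1.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    x = np.zeros(d)               # primal variable
    y = np.zeros(A.shape[0])      # splitting variable for A x
    u = np.zeros(A.shape[0])      # scaled dual variable (multiplier / rho)

    for _ in range(T):
        # Mini-batch stochastic gradient of the smooth loss f at the current x.
        idx = rng.choice(n, size=batch_size, replace=False)
        Xb, bb = X[idx], b[idx]
        g = Xb.T @ (Xb @ x - bb) / batch_size

        # y-update: exact proximal step on the nonsmooth term (soft-thresholding).
        y = soft_threshold(A @ x + u, lam / rho)

        # x-update: linearized (inexact) minimization of the augmented Lagrangian,
        # replacing f by its mini-batch linear approximation and taking a step of size eta.
        grad_aug = g + rho * A.T @ (A @ x - y + u)
        x = x - eta * grad_aug

        # Dual update on the constraint residual A x - y.
        u = u + A @ x - y

    return x
```

In the VR variants (SVRG-ADMM, SAGA-ADMM), the plain mini-batch gradient g above would be replaced by a variance-reduced stochastic gradient estimator; the ADMM splitting and dual updates stay the same.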
