Derivative-free global minimization for a class of multiple minima problems

06/15/2020
by Xiaopeng Luo et al.

We prove that finite-difference based derivative-free descent (FD-DFD) methods are capable of finding the global minima for a class of multiple-minima problems. Our main result shows that, for a class of multiple-minima objectives extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge linearly to the global minimizer x_*, i.e., ‖x_{k+1}-x_*‖_2^2 ⩽ ρ^k ‖x_1-x_*‖_2^2 for a fixed 0<ρ<1 and any initial iterate x_1∈ℝ^d, provided the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension d, the FD-DFD algorithm has a complexity bound of 𝒪(log(1/ϵ)) for finding a point x such that the optimality gap ‖x-x_*‖_2^2 is less than ϵ>0. Numerical experiments in dimensions ranging from 5 to 500 demonstrate the benefits of the FD-DFD method.
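The abstract does not spell out the FD-DFD update rule, but the general idea of finite-difference based derivative-free descent can be illustrated with a short sketch. The Python code below is a hypothetical illustration rather than the authors' algorithm: the name fd_dfd_sketch, the probe count m, the difference radius r, and the step size are all assumptions chosen for the example. It does, however, exhibit the property the abstract emphasizes: each iteration uses a fixed number (2m) of function evaluations, independent of the dimension d.

```python
import numpy as np

def fd_dfd_sketch(f, x1, n_iters=200, m=8, r=0.5, step=0.2, seed=None):
    """Minimal sketch of finite-difference based derivative-free
    descent (NOT the paper's exact FD-DFD update).

    f       : objective mapping R^d -> R
    x1      : initial iterate x_1
    m       : number of random probe directions per iteration;
              per-iteration cost is 2*m evaluations of f,
              fixed and independent of the dimension d
    r       : finite-difference radius (hypothetical choice)
    step    : step size (hypothetical; the paper selects parameters
              so as to guarantee linear convergence)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x1, dtype=float)
    d = x.size
    for _ in range(n_iters):
        # Build a search direction from central finite differences
        # along m random unit directions.
        g = np.zeros_like(x)
        for _ in range(m):
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)
            g += (f(x + r * u) - f(x - r * u)) / (2.0 * r) * u
        g *= d / m  # standard sphere-sampling gradient-estimate scaling
        x = x - step * g
    return x, f(x)

# Illustrative multiple-minima test function: a convex quadratic
# plus small-scale oscillations that create non-global minima.
if __name__ == "__main__":
    f = lambda x: np.dot(x, x) + 0.5 * np.sum(np.sin(5 * x) ** 2)
    x, fx = fd_dfd_sketch(f, x1=np.full(20, 3.0), seed=0)
    print(x[:3], fx)
```

Intuitively, a difference radius r that is large relative to the oscillation scale lets the finite differences respond to the underlying convex-like landscape rather than to the small-scale wiggles, which is consistent with the abstract's claim that FD-DFD can reach the global minimizer for this class of objectives.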
