Robustness of Quantum Algorithms for Nonconvex Optimization

by Weiyuan Gong et al.

Recent results suggest that quantum computers have the potential to speed up nonconvex optimization. However, a crucial factor for the implementation of quantum optimization algorithms is their robustness against experimental and statistical noise. In this paper, we systematically study quantum algorithms for finding an ϵ-approximate second-order stationary point (ϵ-SOSP) of a d-dimensional nonconvex function, a fundamental problem in nonconvex optimization, given noisy zeroth- or first-order oracles as inputs. We first prove that, up to noise of O(ϵ^10/d^5), accelerated perturbed gradient descent with quantum gradient estimation takes O(log d/ϵ^1.75) quantum queries to find an ϵ-SOSP. We then prove that perturbed gradient descent is robust to noise of O(ϵ^6/d^4) on the zeroth-order oracle and O(ϵ/d^(0.5+ζ)) for ζ>0 on the first-order oracle, which yields a quantum algorithm with poly-logarithmic query complexity. We further propose a stochastic gradient descent algorithm using quantum mean estimation on the Gaussian smoothing of noisy oracles, which is robust to O(ϵ^1.5/d) and O(ϵ/√d) noise on the zeroth- and first-order oracles, respectively. This quantum algorithm takes O(d^2.5/ϵ^3.5) and O(d^2/ϵ^3) queries to the two oracles, respectively, giving a polynomial speedup over its classical counterparts. Moreover, we characterize the noise regimes in which quantum algorithms can find an ϵ-SOSP with a poly-logarithmic, polynomial, or exponential number of queries in d, as well as the regime in which the problem is information-theoretically unsolvable even with an infinite number of queries. In addition, we prove an Ω(ϵ^(-12/7)) lower bound in ϵ for any randomized classical or quantum algorithm that finds an ϵ-SOSP using either noisy zeroth- or first-order oracles.
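To make the two classical ingredients of the abstract concrete, the following is a minimal classical sketch (not the paper's quantum algorithm) of (i) estimating the gradient of the Gaussian smoothing of a noisy zeroth-order oracle, and (ii) perturbed gradient descent, which injects a random perturbation when the gradient is small so the iterate can escape strict saddle points. All function names, step sizes, and sample counts here are illustrative assumptions, not taken from the paper; the quantum versions replace the Monte Carlo averaging with quantum mean estimation.

```python
import numpy as np

def smoothed_gradient(f_noisy, x, sigma=0.1, samples=200, rng=None):
    """Estimate the gradient of the Gaussian smoothing
    f_sigma(x) = E[f(x + sigma*u)], u ~ N(0, I), from a noisy
    zeroth-order oracle, via the standard identity
    grad f_sigma(x) = E[u * (f(x + sigma*u) - f(x))] / sigma."""
    rng = rng or np.random.default_rng(0)
    d = x.shape[0]
    grad = np.zeros(d)
    fx = f_noisy(x)
    for _ in range(samples):
        u = rng.standard_normal(d)
        grad += u * (f_noisy(x + sigma * u) - fx)
    return grad / (samples * sigma)

def perturbed_gd(f_noisy, x0, eta=0.05, eps=1e-2,
                 perturb_radius=0.1, steps=500, rng=None):
    """Perturbed gradient descent: follow the estimated gradient,
    and add a small uniform random perturbation whenever the gradient
    norm is below eps, so strict saddle points can be escaped."""
    rng = rng or np.random.default_rng(1)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = smoothed_gradient(f_noisy, x, rng=rng)
        if np.linalg.norm(g) <= eps:
            x = x + perturb_radius * rng.uniform(-1, 1, size=x.shape)
        else:
            x = x - eta * g
    return x
```

As a sanity check on a simple objective, running `perturbed_gd` on f(x) = ‖x‖² with small additive oracle noise drives the iterate into a small neighborhood of the minimizer at the origin; the radius of that neighborhood is governed by the perturbation radius and the variance of the smoothed-gradient estimator.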




