An improved quantum algorithm for low-rank rigid linear regressions with vector solution outputs

01/15/2023
by Changpeng Shao, et al.

Let A ∈ ℝ^{n×d}, b ∈ ℝ^n and λ > 0. For the rigid linear regression argmin_x Z(x) = ‖Ax − b‖^2 + λ^2‖x‖^2, we propose a quantum algorithm, in the framework of block-encoding, that returns a vector solution x̃_opt such that Z(x̃_opt) ≤ (1+ε) Z(x_opt), where x_opt is an optimal solution. If a block-encoding of A is constructed in time O(T), then the cost of the quantum algorithm is roughly O(K√d/ε^1.5 + Kd/ε) when A is low-rank and n = O(d). Here K = Tα/λ and α is a normalization parameter such that A/α is encoded in a unitary through the block-encoding. This can be more efficient than naive quantum algorithms that combine quantum linear solvers with quantum tomography or amplitude estimation, which usually cost O(Kd/ε). The main technique we use is a quantum-accelerated version of leverage score sampling, which may have other applications; the speedup over classical leverage score sampling can be quadratic or even exponential in certain cases. As a byproduct, we propose an improved randomized classical algorithm for rigid linear regressions. Finally, we show some lower bounds on performing leverage score sampling and solving linear regressions on a quantum computer.
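The objective Z(x) and the classical ingredient named in the abstract, leverage score sampling, can be made concrete with a short NumPy sketch: compute ridge leverage scores, sample and rescale rows of A accordingly, and solve the smaller regularized problem. This is only an illustrative classical baseline under assumptions of ours (the function names, sample count, and synthetic low-rank data are hypothetical choices); it is not the paper's quantum algorithm or its improved randomized classical algorithm.

```python
import numpy as np

def ridge_objective(A, b, x, lam):
    """Z(x) = ||A x - b||^2 + lam^2 ||x||^2, as defined in the abstract."""
    return np.linalg.norm(A @ x - b) ** 2 + lam ** 2 * np.linalg.norm(x) ** 2

def ridge_leverage_scores(A, lam):
    """Ridge leverage scores l_i = a_i^T (A^T A + lam^2 I)^{-1} a_i."""
    d = A.shape[1]
    G = A.T @ A + lam ** 2 * np.eye(d)
    Y = np.linalg.solve(G, A.T)          # d x n; column i is (G^{-1}) a_i
    return np.einsum("ij,ji->i", A, Y)   # row-wise inner products a_i . y_i

def leverage_score_ridge(A, b, lam, num_samples, seed=0):
    """Sample rows with probability proportional to their ridge leverage
    scores, rescale them, and solve the reduced ridge problem."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    scores = ridge_leverage_scores(A, lam)
    p = scores / scores.sum()
    idx = rng.choice(n, size=num_samples, replace=True, p=p)
    w = 1.0 / np.sqrt(num_samples * p[idx])  # importance-sampling rescaling
    SA = w[:, None] * A[idx]                 # sketched matrix S A
    Sb = w * b[idx]                          # sketched right-hand side S b
    # Closed-form solution of min_x ||S A x - S b||^2 + lam^2 ||x||^2.
    G = SA.T @ SA + lam ** 2 * np.eye(d)
    return np.linalg.solve(G, SA.T @ Sb)

# Usage on a low-rank A with n on the order of d, as in the abstract's setting.
rng = np.random.default_rng(1)
n, d, r, lam = 2000, 200, 20, 0.5
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))
b = rng.standard_normal(n)
x_tilde = leverage_score_ridge(A, b, lam, num_samples=400)
x_opt = np.linalg.solve(A.T @ A + lam ** 2 * np.eye(d), A.T @ b)
print(ridge_objective(A, b, x_tilde, lam) / ridge_objective(A, b, x_opt, lam))
```

On such inputs the printed ratio should be close to 1, mirroring the Z(x̃_opt) ≤ (1+ε) Z(x_opt) guarantee. The paper's contribution is to accelerate the leverage-score-sampling step itself, quantumly and via an improved classical variant, which this toy baseline does not attempt.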

