# An Improved Classical Singular Value Transformation for Quantum Machine Learning

Quantum machine learning (QML) has shown great potential to produce large quantum speedups for linear algebra tasks. The quantum singular value transformation (QSVT), introduced by [GSLW, STOC'19, arXiv:1806.01838], is a unifying framework to obtain QML algorithms. We provide a classical algorithm that matches the performance of QSVT on low-rank inputs, up to a small polynomial overhead. Under quantum memory assumptions, given a bounded matrix A ∈ ℂ^{m×n}, a vector b ∈ ℂ^n, and a bounded degree-d polynomial p, QSVT can output a measurement from the state |p(A)b⟩ in O(d‖A‖_F) time after linear-time pre-processing. We show that, in the same setting, for any ε > 0, we can output a vector v such that ‖v − p(A)b‖ ≤ ε‖b‖ in O(d^9 ‖A‖_F^4 / ε^2) time after linear-time pre-processing. This improves upon the best known classical algorithm [CGLLTW, STOC'20, arXiv:1910.06151], which requires O(d^22 ‖A‖_F^6 / ε^6) time. Instantiating this algorithm with different polynomials, we obtain fast quantum-inspired algorithms for regression, recommendation systems, and Hamiltonian simulation. We improve upon prior work in numerous parameter settings, including settings that use problem-specialized approaches. Our key insight is to combine the Clenshaw recurrence, an iterative method for computing matrix polynomials, with sketching techniques to simulate QSVT classically. The tools we introduce in this work include (a) a matrix sketch for approximately preserving bilinear forms, (b) an asymmetric approximate matrix product sketch based on ℓ_2^2 sampling, (c) a new stability analysis for the Clenshaw recurrence, and (d) a new technique to bound arithmetic progressions of the coefficients appearing in the Chebyshev series expansion of bounded functions, each of which may be of independent interest.
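The Clenshaw recurrence named above is a standard, numerically stable way to evaluate a polynomial given in the Chebyshev basis. As a point of reference (this is the plain dense recurrence, not the paper's sketched version), a minimal NumPy sketch of computing p(A)b for p(x) = Σ_k c_k T_k(x) and a matrix A with spectral norm at most 1 looks like:

```python
import numpy as np

def clenshaw_matrix(coeffs, A, b):
    """Evaluate p(A) @ b via the Clenshaw recurrence, where
    p(x) = sum_k coeffs[k] * T_k(x) in the Chebyshev basis and
    A is assumed to have spectral norm at most 1.

    Recurrence: u_k = coeffs[k]*b + 2*A@u_{k+1} - u_{k+2},
    initialized with u_{d+1} = u_{d+2} = 0, and
    p(A)b = coeffs[0]*b + A@u_1 - u_2.
    """
    d = len(coeffs) - 1
    u_curr = np.zeros_like(b)  # u_{k+1}
    u_next = np.zeros_like(b)  # u_{k+2}
    for k in range(d, 0, -1):
        u_curr, u_next = coeffs[k] * b + 2 * (A @ u_curr) - u_next, u_curr
    return coeffs[0] * b + A @ u_curr - u_next

# Small sanity check on a diagonal matrix, where p(A)b acts
# entrywise as p(eigenvalue) * entry.
A = np.diag([0.5, -0.3])
b = np.array([1.0, 1.0])
c = [0.2, 0.5, 0.3]  # p(x) = 0.2*T_0 + 0.5*T_1 + 0.3*T_2
v = clenshaw_matrix(c, A, b)
```

The dense version above costs one matrix-vector product per degree; the abstract's point is that each iteration can instead be carried out on sketched data.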

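Tool (b) in the abstract builds on ℓ_2^2 ("length-squared") sampling, the classical idea, going back to Frieze, Kannan, and Vempala, underlying most quantum-inspired algorithms: approximate a matrix product by sampling rows with probability proportional to their squared norms. A minimal symmetric illustration of the idea (the paper's sketch is an asymmetric variant, which this does not reproduce):

```python
import numpy as np

def l2_squared_product(A, B, s, rng):
    """Estimate A.T @ B by sampling s rows of A (and the matching rows
    of B) with probability proportional to A's squared row norms.
    The estimator is unbiased, with error decaying like 1/sqrt(s)."""
    probs = np.sum(np.abs(A) ** 2, axis=1)
    probs /= probs.sum()                      # p_i = ||A_i||^2 / ||A||_F^2
    idx = rng.choice(A.shape[0], size=s, p=probs)
    scale = 1.0 / np.sqrt(s * probs[idx])     # importance weights
    SA = scale[:, None] * A[idx]              # sketched rows of A
    SB = scale[:, None] * B[idx]              # matching rows of B
    return SA.T @ SB                          # ≈ A.T @ B

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 4))
B = rng.standard_normal((500, 3))
est = l2_squared_product(A, B, s=20000, rng=rng)
exact = A.T @ B
rel_err = np.linalg.norm(est - exact) / (np.linalg.norm(A) * np.linalg.norm(B))
```

The quantum memory assumption in the abstract is what makes this sampling step cheap classically: it grants the ability to draw rows from exactly this squared-norm distribution without reading the whole matrix.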