Tight Bounds for the Subspace Sketch Problem with Applications

04/11/2019
by Yi Li, et al.

In the subspace sketch problem one is given an n × d matrix A with O(log(nd))-bit entries, and would like to compress it in an arbitrary way to build a small space data structure Q_p, so that for any given x ∈ ℝ^d, with probability at least 2/3, one has Q_p(x) = (1 ± ϵ)‖Ax‖_p, where p ≥ 0, and where the randomness is over the construction of Q_p. The central question is: How many bits are necessary to store Q_p? This problem has applications to the communication complexity of approximating the number of non-zeros in a matrix product, the size of coresets in projective clustering, the memory of streaming algorithms for regression in the row-update model, and embedding subspaces of L_p in functional analysis. A major open question is the dependence on the approximation factor ϵ. We show that if p ≥ 0 is not a positive even integer and d = Ω(log(1/ϵ)), then Ω̃(ϵ^-2 d) bits are necessary. On the other hand, if p is a positive even integer, then there is an upper bound of O(d^p log(nd)) bits, independent of ϵ. Our results are optimal up to logarithmic factors, and show in particular that one cannot compress A to O(d) "directions" v_1, ..., v_O(d), such that for any x, ‖Ax‖_1 can be well-approximated from 〈v_1, x〉, ..., 〈v_O(d), x〉. Our lower bound rules out arbitrary functions of these inner products (and in fact arbitrary data structures built from A), and thus rules out the possibility of a singular value decomposition for ℓ_1 in a very strong sense. Indeed, as ϵ → 0, for p = 1 the space complexity becomes arbitrarily large, while for p = 2 it is at most O(d^2 log(nd)). As corollaries of our main lower bound, we obtain new lower bounds for a wide range of applications, including the above, which in many cases are optimal.
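To see why the even-p upper bound is independent of ϵ, consider the p = 2 case: the Gram matrix G = AᵀA has only d × d entries, and xᵀGx = ‖Ax‖_2² exactly, so ‖Ax‖_2 can be answered for every query x from O(d² log(nd)) bits without storing A. The following minimal sketch (our own illustration, not code from the paper; the NumPy usage and variable names are ours) demonstrates this identity numerically.

```python
import numpy as np

# Illustrative sketch of the p = 2 case: store only G = A^T A (d x d entries).
# Since x^T G x = ||Ax||_2^2 exactly, the query answer needs no access to A,
# which is why the even-p upper bound has no dependence on epsilon.

rng = np.random.default_rng(0)
n, d = 1000, 10
A = rng.integers(-5, 6, size=(n, d)).astype(float)  # small integer entries

G = A.T @ A                     # the O(d^2)-size "data structure" for p = 2

x = rng.standard_normal(d)      # an arbitrary query vector
from_sketch = np.sqrt(x @ G @ x)        # computed from G alone
exact = np.linalg.norm(A @ x, 2)        # computed from A, for comparison

print(abs(from_sketch - exact))         # ~0, up to floating-point error
```

The paper's lower bound says that nothing analogous is possible for p = 1 (or any p that is not a positive even integer): any data structure answering ‖Ax‖_1 queries to within 1 ± ϵ must use Ω̃(ϵ^-2 d) bits.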
