Cumulative Memory Lower Bounds for Randomized and Quantum Computation

01/13/2023
by Paul Beame, et al.

Cumulative memory, the sum of the space used over the steps of a computation, is a fine-grained measure of time-space complexity. It is a more accurate measure of cost than worst-case space for algorithms with infrequent spikes in memory usage, particularly on technologies such as cloud computing that allow resources to be allocated and de-allocated dynamically during execution. We give the first lower bounds on cumulative memory complexity that apply to general sequential classical algorithms. We also prove the first such bounds for bounded-error quantum circuits. Among many possible applications, we show that any classical sorting algorithm with success probability at least 1/poly(n) requires cumulative memory Ω̃(n^2), any classical matrix multiplication algorithm requires cumulative memory Ω(n^6/T), any quantum sorting circuit requires cumulative memory Ω(n^3/T), and any quantum circuit that finds k disjoint collisions in a random function requires cumulative memory Ω(k^3n/T^2). More generally, we present theorems that can be used to convert a wide class of existing time-space tradeoff lower bounds into matching lower bounds on cumulative memory complexity.
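
For concreteness, here is a minimal formalization of the measure, written as a LaTeX sketch in notation of my own choosing (the symbols T, S_t, S, and CM are illustrative and not taken from the paper):

% Cumulative memory of a T-step computation whose space usage at step t is S_t
% (notation illustrative, not from the paper).
\[
  \mathrm{CM} \;=\; \sum_{t=1}^{T} S_t
  \;\le\; T \cdot \max_{1 \le t \le T} S_t \;=\; T \cdot S .
\]
% Consequently, a lower bound CM = Omega(f(n)) implies the time-space tradeoff
% T * S = Omega(f(n)), while an algorithm whose space spikes only briefly can
% have CM far below T * S, so a cumulative memory bound is the stronger statement.

Because CM ≤ T·S always holds, a cumulative memory lower bound is at least as strong as the corresponding time-space tradeoff lower bound, which is consistent with the paper's claim that a wide class of existing time-space tradeoff lower bounds can be converted into matching cumulative memory bounds.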
