On the amortized complexity of approximate counting

11/08/2022
by Ishaq Aden-Ali, et al.

Naively storing a counter up to value n requires Ω(log n) bits of memory. Nelson and Yu [NY22], following the work of Morris [Morris78], showed that if query answers need only be (1+ϵ)-approximate with probability at least 1 − δ, then O(log log n + log log(1/δ) + log(1/ϵ)) bits suffice, and this bound is tight. Morris' original motivation for studying this problem, as well as modern applications, requires maintaining not just one counter but k counters for large k. This motivates the following question: can k counters be maintained simultaneously using asymptotically less memory than k times the cost of a single counter? In other words, does this problem admit an improved amortized space complexity? We answer this question in the negative. Specifically, we prove a lower bound for nearly the full range of parameters showing that, in terms of memory usage, amortization yields no asymptotic benefit when storing multiple counters. Our main proof uses a notion of "information cost" recently introduced by Braverman, Garg, and Woodruff in FOCS 2020 to prove lower bounds for streaming algorithms.
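To make the object of study concrete, here is a minimal sketch of the classical Morris counter that the abstract builds on: instead of n itself, it stores a value X ≈ log₂(n), incrementing X with probability 2^(−X). This is an illustrative simplification, not the [NY22] construction (which achieves the stated optimal bound); the class and method names are our own.

```python
import random

class MorrisCounter:
    """Approximate counter in the spirit of Morris [Morris78].

    Stores X ~ log2(n) using O(log log n) bits rather than
    the Omega(log n) bits a naive counter would need.
    """

    def __init__(self, rng=None):
        self.x = 0
        self.rng = rng if rng is not None else random.Random()

    def increment(self):
        # Bump X with probability 2^(-X); on average, X tracks log2(n).
        if self.rng.random() < 2.0 ** (-self.x):
            self.x += 1

    def estimate(self):
        # E[2^X] = n + 1, so 2^X - 1 is an unbiased estimate of n.
        return 2 ** self.x - 1


def averaged_estimate(n, num_counters=100, seed=0):
    """Average many independent counters to reduce the (large) variance
    of a single Morris counter's estimate."""
    rng = random.Random(seed)
    counters = [MorrisCounter(rng) for _ in range(num_counters)]
    for _ in range(n):
        for c in counters:
            c.increment()
    return sum(c.estimate() for c in counters) / num_counters
```

A single counter's estimate has variance on the order of n²/2, which is why practical deployments (and the k-counter setting the paper studies) average many copies or use a finer increment base to trade memory for accuracy.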
