Neural Networks Optimally Compress the Sawbridge

11/10/2020
by Aaron B. Wagner, et al.

Neural-network-based compressors have proven to be remarkably effective at compressing sources, such as images, that are nominally high-dimensional but presumed to be concentrated on a low-dimensional manifold. We consider a continuous-time random process that models an extreme version of such a source, wherein the realizations fall along a one-dimensional "curve" in function space that has infinite-dimensional linear span. We precisely characterize the optimal entropy-distortion tradeoff for this source and show numerically that it is achieved by neural-network-based compressors trained via stochastic gradient descent. In contrast, we show both analytically and experimentally that compressors based on the classical Karhunen-Loève transform are highly suboptimal at high rates.
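The "sawbridge" named in the title is commonly defined as X(t) = t − 𝟙{t ≥ U} on [0, 1], with U uniform on [0, 1]: each realization is determined by the single scalar U, so the process traces a one-dimensional curve in function space even though its linear span is infinite-dimensional. A minimal sketch of sampling discretized realizations, assuming this standard definition (the function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def sawbridge(ts, u):
    """Sawbridge path X(t) = t - 1{t >= u}, sampled at times ts in [0, 1)."""
    return ts - (ts >= u).astype(float)

rng = np.random.default_rng(0)
n_paths, n_samples = 4, 64
ts = np.linspace(0.0, 1.0, n_samples, endpoint=False)

# Each path is pinned down by one uniform random variable U,
# so the "source" is one-dimensional despite living in R^64.
paths = np.stack([sawbridge(ts, rng.uniform()) for _ in range(n_paths)])
```

Because every path is a deterministic function of U, a compressor that quantizes U directly can exploit the one-dimensional structure, whereas a fixed linear (Karhunen-Loève) transform must describe each path through infinitely many coefficients.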
