Metric mean dimension and analog compression

12/02/2018
by Yonatan Gutman, et al.

Wu and Verdú developed a theory of almost lossless analog compression, in which one imposes various regularity conditions on the compressor and the decompressor, and the input signal is modelled by a (typically infinite-entropy) stationary stochastic process. In this work we consider all stationary stochastic processes with trajectories in a prescribed set S ⊂ [0,1]^ℤ of (bi-)infinite sequences and find uniform lower and upper bounds for certain compression rates in terms of metric mean dimension, mean box dimension and mean Rényi information dimension. An essential tool is the recent Lindenstrauss–Tsukamoto variational principle, which expresses metric mean dimension in terms of rate-distortion functions. We also obtain lower bounds on compression rates for a fixed stationary process in terms of its rate-distortion function and information dimension rates, and we study several examples. Finally, we give a new formulation of the variational principle for metric mean dimension in terms of the Rényi information dimension.
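For orientation, the LaTeX sketch below records the standard definition of the Rényi information dimension and a schematic form of the Lindenstrauss–Tsukamoto variational principle. The precise distortion measure (L^1 versus L^∞) and the regularity hypotheses under which the principle is proved are omitted here, so this is an assumed outline for context rather than the statement used in the paper.

% Minimal sketch (assumed forms; not verbatim from the paper).
% Rényi information dimension of a real random variable X, when the limit exists;
% H denotes the Shannon entropy of the quantization of X at resolution 1/m:
\[
  d(X) \;=\; \lim_{m\to\infty} \frac{H\!\left(\lfloor mX \rfloor / m\right)}{\log m}.
\]
% Schematic variational principle for a dynamical system (X, T) with metric d,
% where M_T(X) is the set of T-invariant probability measures and
% R_\mu(\varepsilon) is the rate-distortion function of the stationary process
% generated by (X, T, \mu) at distortion level \varepsilon:
\[
  \mathrm{mdim}_{\mathrm M}(X, T, d)
  \;=\; \limsup_{\varepsilon \to 0}\,
        \frac{\sup_{\mu \in M_T(X)} R_\mu(\varepsilon)}{\log(1/\varepsilon)}.
\]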
