Low Complexity Gaussian Latent Factor Models and a Blessing of Dimensionality

06/11/2017
by Greg Ver Steeg, et al.

Learning the structure of graphical models from data is a fundamental problem that typically carries a curse of dimensionality. We consider a special class of Gaussian latent factor models where each observed variable depends on at most one of a set of latent variables. We derive information-theoretic lower bounds on the sample complexity for structure recovery that suggest a blessing of dimensionality. With a fixed number of samples, structure recovery for this class using existing methods deteriorates with increasing dimension. We design a new approach to learning Gaussian latent factor models with low computational complexity that empirically benefits from dimensionality. Our approach relies on an information-theoretic constraint to find parsimonious solutions without adding regularizers or sparsity hyper-parameters. Besides improved structure recovery, we also show that we are able to outperform state-of-the-art approaches for covariance estimation on both synthetic data and on under-sampled, high-dimensional stock market data.
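To make the model class concrete, here is a minimal sketch (not the paper's algorithm) of simulating data from a Gaussian latent factor model in which each observed variable depends on at most one latent factor. All dimensions, loadings, and variable names are illustrative assumptions. Under this structure, the population covariance is block-diagonal after grouping variables by their assigned factor, which is the property structure-recovery methods exploit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model sizes (not from the paper):
m, p, n = 3, 12, 500          # latent factors, observed variables, samples

# Each observed variable i depends on at most one latent factor c(i)
assignment = rng.integers(0, m, size=p)   # c(i): factor index for variable i
loadings = rng.uniform(0.5, 1.5, size=p)  # a_i: loading on the assigned factor

Z = rng.standard_normal((n, m))           # latent factors, i.i.d. N(0, 1)
noise = rng.standard_normal((n, p))
X = Z[:, assignment] * loadings + noise   # X_i = a_i * Z_{c(i)} + eps_i

# With this structure, cov(X_i, X_j) = a_i * a_j when c(i) == c(j) and
# i != j, and 0 otherwise, so the covariance is block-diagonal once
# variables are permuted to group them by factor.
emp_cov = np.cov(X, rowvar=False)
same = assignment[:, None] == assignment[None, :]
within = np.abs(emp_cov[same & ~np.eye(p, dtype=bool)]).mean()
between = np.abs(emp_cov[~same]).mean()
print(within > between)  # within-block covariances dominate
```

In this sketch, structure recovery amounts to identifying the variable-to-factor assignment from the empirical covariance; the paper's contribution is doing this with an information-theoretic objective, without sparsity regularizers or extra hyper-parameters.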
