Exact asymptotics for phase retrieval and compressed sensing with random generative priors

12/04/2019
by Benjamin Aubin, et al.

We consider the problems of compressed sensing and of (real-valued) phase retrieval with random measurement matrices. We derive sharp asymptotics for the information-theoretically optimal performance and for the best known polynomial algorithm, for an ensemble of generative priors consisting of fully connected deep neural networks with random weight matrices and arbitrary activations. We compare this performance to that of sparse separable priors and conclude that generative priors can be advantageous in terms of algorithmic performance. In particular, while sparsity does not allow compressive phase retrieval to be performed efficiently close to its information-theoretic limit, we find that under the random generative prior compressive phase retrieval becomes tractable.
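To make the two measurement models in the abstract concrete, here is a minimal numpy sketch: a signal drawn from a random single-layer generative prior, observed either linearly (compressed sensing) or through absolute values (real-valued phase retrieval). The dimensions, the single-layer depth, and the ReLU activation are illustrative choices for this sketch; the paper's ensemble covers deeper networks and arbitrary activations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: latent k, signal n, measurements m.
k, n, m = 20, 100, 60

# Random generative prior: one fully connected layer with i.i.d.
# Gaussian weights and a ReLU activation (a toy instance of the
# random-weight ensemble described in the abstract).
W = rng.standard_normal((n, k)) / np.sqrt(k)
z = rng.standard_normal(k)
x = np.maximum(W @ z, 0.0)   # signal produced by the generative prior

# Random Gaussian measurement matrix shared by both inverse problems.
A = rng.standard_normal((m, n)) / np.sqrt(n)

y_cs = A @ x                 # compressed sensing: linear measurements
y_pr = np.abs(A @ x)         # real-valued phase retrieval: signs are lost
```

Recovering `x` (or the latent `z`) from `y_pr` alone is the compressive phase retrieval problem whose tractability under this prior the paper analyzes.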

Related research
- 08/24/2020 · Compressive Phase Retrieval: Optimal Sample Complexity with Deep Generative Priors
  Advances in compressive sensing provided reconstruction algorithms of sp...
- 06/07/2020 · Constant-Expansion Suffices for Compressed Sensing with Generative Priors
  Generative neural networks have been empirically found very promising in...
- 06/20/2019 · Algorithmic Guarantees for Inverse Imaging with Untrained Network Priors
  Deep neural networks as image priors have been recently introduced for p...
- 08/04/2023 · Generative Image Priors for MRI Reconstruction Trained from Magnitude-Only Images
  Purpose: In this work, we present a workflow to construct generic and ro...
- 05/24/2019 · On the Global Minimizers of Real Robust Phase Retrieval with Sparse Noise
  We study a class of real robust phase retrieval problems under a Gaussia...
- 07/11/2018 · Phase Retrieval Under a Generative Prior
  The phase retrieval problem asks to recover a natural signal y_0 ∈ R^n fr...
- 08/24/2017 · Bayesian Compressive Sensing Using Normal Product Priors
  In this paper, we introduce a new sparsity-promoting prior, namely, the ...
