Iterative Alpha Expansion for estimating gradient-sparse signals from linear measurements

05/15/2019
by Sheng Xu, et al.

We consider estimating a piecewise-constant image, or a gradient-sparse signal on a general graph, from noisy linear measurements. We propose and study an iterative algorithm to minimize a penalized least-squares objective, with a penalty given by the "l_0-norm" of the signal's discrete graph gradient. The method proceeds by approximate proximal descent, applying the alpha-expansion procedure to approximately solve the proximal step in each iteration, and using a geometric decay of the penalty parameter across iterations. Under a cut-restricted isometry property for the measurement design, we prove global recovery guarantees for the estimated signal. For standard Gaussian designs, the required number of measurements is independent of the graph structure, and improves upon worst-case guarantees for total-variation (TV) compressed sensing on the 1-D and 2-D lattice graphs by polynomial and logarithmic factors, respectively. Empirically, the method yields lower mean-squared recovery error than TV regularization in regimes of moderate undersampling and moderate-to-high signal-to-noise ratio, for several examples of changepoint signals and gradient-sparse phantom images.
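To make the outer loop concrete, here is a minimal sketch of the proximal-descent scheme described above, specialized to the 1-D chain graph. On a chain, the proximal step for the l_0 graph-gradient penalty can be solved exactly by an optimal-partitioning dynamic program, so no alpha-expansion is needed; for general graphs the paper's method would replace `prox_l0_chain` with an alpha-expansion solve. The step size, initial penalty `lam0`, and decay rate `decay` below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def prox_l0_chain(z, lam):
    """Exactly minimize 0.5*||x - z||^2 + lam * #{i : x_i != x_{i+1}}
    over x, for a 1-D chain graph, via optimal-partitioning DP (O(n^2))."""
    n = len(z)
    csum = np.concatenate([[0.0], np.cumsum(z)])
    csq = np.concatenate([[0.0], np.cumsum(z ** 2)])

    def sse(i, j):  # squared error of z[i:j] around its own mean
        s = csum[j] - csum[i]
        return (csq[j] - csq[i]) - s * s / (j - i)

    cost = np.full(n + 1, np.inf)
    cost[0] = -lam  # so that k segments pay (k-1) changepoint penalties
    argmin = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):
            c = cost[i] + lam + 0.5 * sse(i, j)
            if c < cost[j]:
                cost[j], argmin[j] = c, i
    # backtrack: fill each segment with the mean of z over that segment
    x = np.empty(n)
    j = n
    while j > 0:
        i = argmin[j]
        x[i:j] = (csum[j] - csum[i]) / (j - i)
        j = i
    return x

def iterative_estimate(A, y, lam0=1.0, decay=0.8, n_iter=50):
    """Approximate proximal descent with geometrically decaying penalty,
    following the scheme in the abstract (illustrative parameters)."""
    n = A.shape[1]
    eta = 1.0 / np.linalg.norm(A, 2) ** 2  # gradient step from spectral norm
    x = np.zeros(n)
    lam = lam0
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        # prox of eta * lam * ||grad x||_0 at the gradient-descent point
        x = prox_l0_chain(x - eta * grad, lam * eta)
        lam *= decay  # geometric decay of the penalty parameter
    return x
```

A small penalty keeps true changepoints while a large one merges segments; the decaying schedule moves from coarse to fine reconstructions across iterations.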


