Image Segmentation by Iterative Inference from Conditional Score Estimation

by Adriana Romero, et al.

Inspired by the combination of feedforward and iterative computations in the visual cortex, and taking advantage of the ability of denoising autoencoders to estimate the score of a joint distribution, we propose a novel approach to iterative inference for capturing and exploiting the complex joint distribution of output variables conditioned on some input variables. This approach is applied to image pixel-wise segmentation, with the estimated conditional score used to perform gradient ascent towards a mode of the estimated conditional distribution. This extends previous work on score estimation by denoising autoencoders to the case of a conditional distribution, with a novel use of a corrupted feedforward predictor replacing Gaussian corruption. An advantage of this approach over more classical ways to perform iterative inference for structured outputs, such as conditional random fields (CRFs), is that it is no longer necessary to define an explicit energy function linking the output variables. To keep computations tractable, such energy function parametrizations are typically fairly constrained, involving only a few neighbors of each output variable in each clique. We experimentally find that the proposed iterative inference from conditional score estimation by conditional denoising autoencoders performs better than comparable models based on CRFs or models not using any explicit modeling of the conditional joint distribution of outputs.
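The core mechanism can be sketched in a few lines. A denoising autoencoder r trained with corruption noise of standard deviation sigma estimates the score of the data distribution as (r(y, x) - y) / sigma^2, and iterative inference follows that score by gradient ascent. The sketch below is a hypothetical minimal illustration, not the paper's model: `make_toy_denoiser`, `iterative_inference`, and the closed-form Gaussian denoiser are assumptions introduced here so the loop runs end-to-end, standing in for a trained conditional DAE over segmentation maps.

```python
import numpy as np

def make_toy_denoiser(mu, sigma_noise, sigma_target):
    # Optimal denoiser for y ~ N(mu(x), sigma_target^2) corrupted with
    # N(0, sigma_noise^2) noise: shrink the input towards the conditional mean.
    w = sigma_target**2 / (sigma_target**2 + sigma_noise**2)
    def denoiser(y, x):
        return mu(x) + w * (y - mu(x))
    return denoiser

def iterative_inference(y0, x, denoiser, sigma_noise, step=0.1, n_steps=500):
    """Gradient ascent on the estimated conditional log-density.

    The DAE reconstruction residual gives the score estimate
    (d/dy) log p(y | x) ~= (r(y, x) - y) / sigma_noise**2.
    """
    y = y0.copy()
    for _ in range(n_steps):
        score = (denoiser(y, x) - y) / sigma_noise**2  # estimated score
        y = y + step * sigma_noise**2 * score          # ascent step
    return y

# Toy conditional target: y | x is Gaussian with mean mu(x) = 2x.
mu = lambda x: 2.0 * x
den = make_toy_denoiser(mu, sigma_noise=0.5, sigma_target=1.0)

x = np.array([1.0, -1.0])
y0 = np.zeros(2)
y_star = iterative_inference(y0, x, den, sigma_noise=0.5)
# y_star converges towards the conditional mode mu(x) = [2, -2]
```

In the paper's setting, y would be a pixel-wise segmentation proposal and x the input image, with the denoiser trained on corrupted feedforward predictions rather than Gaussian-corrupted targets; the ascent loop itself is unchanged.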


