EP-GIG Priors and Applications in Bayesian Sparse Learning

04/19/2012
by Zhihua Zhang, et al.

In this paper we propose a novel framework for constructing sparsity-inducing priors. In particular, we define such priors as mixtures of exponential power distributions with a generalized inverse Gaussian mixing density (EP-GIG). EP-GIG is a variant of the generalized hyperbolic distributions, and its special cases include Gaussian scale mixtures and Laplace scale mixtures. Furthermore, Laplace scale mixtures provide a Bayesian framework for sparse learning with nonconvex penalization. The densities of EP-GIG can be expressed explicitly, and the corresponding posterior distribution of the mixing variable also follows a generalized inverse Gaussian distribution. These properties lead us to EM algorithms for Bayesian sparse learning. We show that these algorithms bear an interesting resemblance to iteratively re-weighted ℓ_2 or ℓ_1 methods. In addition, we present two extensions for grouped variable selection and logistic regression.
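The iteratively re-weighted ℓ_2 flavor of such EM algorithms can be illustrated with the simplest member of the family: a Laplace prior written as a Gaussian scale mixture. The sketch below is not the paper's exact algorithm; it assumes a standard exponential mixing density (so the E-step weight for each coefficient is γ/|β_j|) and the hypothetical function name and parameters are for illustration only.

```python
import numpy as np

def em_sparse_regression(X, y, gamma=1.0, n_iter=50, eps=1e-8):
    """Illustrative EM for sparse linear regression under a Laplace
    prior viewed as a Gaussian scale mixture (not the paper's EP-GIG
    algorithm, just the re-weighted l2 pattern it resembles).

    E-step: weight w_j = E[1/tau_j | beta_j] = gamma / |beta_j|
    M-step: solve an adaptively weighted ridge system.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares init
    for _ in range(n_iter):
        w = gamma / (np.abs(beta) + eps)          # E-step: posterior weights
        A = X.T @ X + np.diag(w)                  # M-step: weighted ridge
        beta = np.linalg.solve(A, X.T @ y)
    return beta

# toy example with a sparse ground truth
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta_true = np.zeros(10)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.1 * rng.standard_normal(100)
beta_hat = em_sparse_regression(X, y, gamma=5.0)
```

Each iteration is a ridge regression whose per-coefficient penalty grows as the coefficient shrinks, which is exactly the adaptive, nonconvex shrinkage behavior the abstract attributes to the EM algorithms.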


