Robustly estimating the marginal likelihood for cognitive models via importance sampling

by Minh-Ngoc Tran et al.

Recent advances in Markov chain Monte Carlo (MCMC) extend the scope of Bayesian inference to models whose likelihood function is intractable. Although these developments allow us to estimate model parameters, other basic problems, such as estimating the marginal likelihood, a fundamental tool in Bayesian model selection, remain challenging. This is an important scientific limitation because testing psychological hypotheses with hierarchical models has proven difficult with current model selection methods. We propose an efficient method for estimating the marginal likelihood of models whose likelihood is intractable but can be estimated unbiasedly. The method first runs a sampling scheme such as MCMC to obtain proposals for the model parameters, and then uses these proposals in an importance sampling (IS) framework with an unbiased estimate of the likelihood. Our method has several attractive properties: it generates an unbiased estimate of the marginal likelihood, it is robust to the quality and target of the sampling method used to form the IS proposals, and it is computationally cheap to estimate the variance of the marginal likelihood estimator. We also derive the convergence properties of the method and provide guidelines for maximizing computational efficiency. The method is illustrated in two challenging cases involving hierarchical models: identifying the form of individual differences in an applied choice scenario, and evaluating the best parameterization of a cognitive model in a speeded decision-making context. Freely available code to implement the methods is provided. Extensions to posterior moment estimation and parallelization are also discussed.
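The core recipe in the abstract — draw posterior samples, fit an IS proposal to them, then average importance weights of the form likelihood × prior / proposal — can be illustrated on a toy model. The sketch below is a hypothetical, simplified example (a conjugate normal model where the exact marginal likelihood is available for comparison, and exact posterior sampling stands in for MCMC); it is not the paper's cognitive models or its pseudo-marginal machinery, and the slight inflation of the proposal's scale is one common heuristic for keeping the weights well behaved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (hypothetical): y_i ~ N(theta, 1), prior theta ~ N(0, 1).
y = rng.normal(0.5, 1.0, size=20)
n = len(y)

def log_lik(theta):
    # Vectorized log-likelihood over a vector of theta values.
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((y - theta[:, None]) ** 2, axis=1)

def log_prior(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * theta ** 2

# Step 1: posterior draws. Here the conjugate posterior is sampled exactly;
# in the general setting these would come from (pseudo-marginal) MCMC.
post_var = 1.0 / (n + 1.0)
post_mean = post_var * y.sum()
draws = rng.normal(post_mean, np.sqrt(post_var), size=5000)

# Step 2: fit an IS proposal g to the draws (Gaussian, scale inflated for tails).
g_mean, g_sd = draws.mean(), 1.5 * draws.std()

# Step 3: IS estimate of the marginal likelihood, computed in log space
# with the usual max-shift trick for numerical stability.
theta = rng.normal(g_mean, g_sd, size=20000)
log_g = -0.5 * np.log(2 * np.pi * g_sd ** 2) - 0.5 * ((theta - g_mean) / g_sd) ** 2
log_w = log_lik(theta) + log_prior(theta) - log_g
shift = log_w.max()
log_ml_is = shift + np.log(np.mean(np.exp(log_w - shift)))

# Exact log marginal likelihood for this toy model: y ~ N(0, I + 11^T).
Sigma = np.eye(n) + np.ones((n, n))
_, logdet = np.linalg.slogdet(Sigma)
log_ml_exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * logdet
                - 0.5 * y @ np.linalg.solve(Sigma, y))
print(log_ml_is, log_ml_exact)
```

Because each importance weight is an unbiased estimate of the marginal likelihood, the sample variance of the weights gives a cheap Monte Carlo error estimate, which is one of the robustness properties the abstract highlights.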


