Calibrating Model-Based Inferences and Decisions

03/22/2018
by Michael Betancourt

As the frontiers of applied statistics progress through increasingly complex experiments, we must exploit increasingly sophisticated inferential models to analyze the observations we make. To avoid misleading or outright erroneous inferences, we then have to be increasingly diligent in scrutinizing the consequences of those modeling assumptions. Fortunately, model-based methods of statistical inference naturally define procedures for quantifying the scope of inferential outcomes and for calibrating the corresponding decision-making processes. In this paper I review the construction and implementation of the particular procedures that arise within frequentist and Bayesian methodologies.
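
To make the idea of model-based calibration concrete, the following is a minimal sketch, not the paper's specific procedure: ground-truth parameters and data are simulated from a simple conjugate normal model, the exact posterior is computed for each simulated data set, and the long-run behaviour of a credible interval and a sign decision is tabulated across replications. The normal-normal model, the 90% interval target, and the sign-decision rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n_sims, n_obs, sigma = 5000, 10, 1.0   # replications, observations per data set, known noise scale
prior_mean, prior_sd = 0.0, 1.0        # prior on the location parameter mu

covered = 0
correct = 0

for _ in range(n_sims):
    # 1. Draw a ground truth from the prior and observations from the model.
    mu_true = rng.normal(prior_mean, prior_sd)
    y = rng.normal(mu_true, sigma, size=n_obs)

    # 2. Exact conjugate posterior for mu in the normal-normal model.
    post_prec = 1.0 / prior_sd**2 + n_obs / sigma**2
    post_sd = post_prec ** -0.5
    post_mean = (prior_mean / prior_sd**2 + y.sum() / sigma**2) / post_prec

    # 3. Inferential outcome: does the central 90% credible interval cover the truth?
    covered += (post_mean - 1.645 * post_sd <= mu_true <= post_mean + 1.645 * post_sd)

    # 4. Decision outcome: declare "mu > 0" when the posterior puts more than half its
    #    mass above zero (equivalent to post_mean > 0 for this symmetric posterior).
    correct += ((post_mean > 0) == (mu_true > 0))

print(f"90% interval coverage over model-simulated truths: {covered / n_sims:.3f}")
print(f"Sign-decision accuracy over model-simulated truths: {correct / n_sims:.3f}")
```

Because the ground truths are drawn from the same model used for inference, the interval coverage should land near its nominal 90%; systematic departures in a calibration like this signal a mismatch between the assumed model and the inferential or decision procedure being evaluated.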

