Higher-order asymptotics for the parametric complexity

10/01/2015
by James G. Dowty, et al.

The parametric complexity is the key quantity in the minimum description length (MDL) approach to statistical model selection. Rissanen and others have shown that the parametric complexity of a statistical model approaches a simple function of the Fisher information volume of the model as the sample size n goes to infinity. This paper derives higher-order asymptotic expansions for the parametric complexity, in the case of exponential families and independent and identically distributed data. These higher-order approximations are calculated for some examples and are shown to have better finite-sample behaviour than Rissanen's approximation. The higher-order terms are given as expressions involving cumulants (or, more naturally, the Amari-Chentsov tensors), and these terms are likely to be interesting in themselves since they arise naturally from the general information-theoretic principles underpinning MDL. The derivation given here specializes to an alternative and arguably simpler proof of Rissanen's result (for the case considered here), proving for the first time that the error in his approximation is O(n^-1).
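For context, the first-order result that this paper refines is Rissanen's classical asymptotic formula for the parametric complexity, sketched below in standard notation (the symbols d, \Theta and I(\theta) are the conventional choices and may not match the paper's own):

% Rissanen's first-order approximation (standard form; notation assumed, not taken from the paper)
\[
  \log C_n(\mathcal{M})
    \;=\; \frac{d}{2}\,\log\frac{n}{2\pi}
    \;+\; \log \int_{\Theta} \sqrt{\det I(\theta)}\,\mathrm{d}\theta
    \;+\; o(1),
\]

where C_n(\mathcal{M}) is the normalizing constant of the normalized maximum likelihood (NML) distribution for a d-parameter model \mathcal{M}, I(\theta) is the Fisher information matrix and n is the sample size. The second term is the log of the Fisher information volume referred to in the abstract; the paper's contribution is to sharpen the o(1) error term to O(n^-1) and to give the higher-order corrections explicitly in terms of cumulants (Amari-Chentsov tensors).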
