Optimal Bias Correction of the Log-periodogram Estimator of the Fractional Parameter: A Jackknife Approach

03/19/2019
by Kanchana Nadarajah, et al.

We use the jackknife to bias-correct the log-periodogram regression (LPR) estimator of the fractional parameter, d, in a stationary fractionally integrated model. The weights used to construct the jackknife estimator are chosen such that bias reduction occurs to an order of n^{-α} (where n is the sample size), for some 0 < α < 1, while the increase in variance is minimized, with the weights viewed as 'optimal' in this sense. We show that, under regularity conditions, the bias-corrected estimator is consistent and asymptotically normal, with the same asymptotic variance and n^{α/2} rate of convergence as the original LPR estimator. In other words, the use of optimal weights enables bias reduction to be achieved without the usual increase in asymptotic variance being incurred. These theoretical results are valid under both the non-overlapping and moving-block sub-sampling schemes that can be used in the jackknife technique, and do not require the assumption of Gaussianity for the data generating process. A Monte Carlo study explores the finite-sample performance of different versions of the optimal jackknife estimator under a variety of data generating processes, including alternative specifications for the short-memory dynamics. The comparators in the simulation exercise are the raw (unadjusted) LPR estimator and two alternative bias-adjusted estimators, namely the weighted-average estimator of Guggenberger and Sun (2006) and the pre-filtered sieve bootstrap-based estimator of Poskitt, Martin and Grose (2016). The paper concludes with some discussion of open issues and possible extensions to the work.
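To fix ideas, the Python sketch below illustrates the mechanics of a sub-sample jackknife applied to the LPR (GPH) estimator. It is a minimal illustration under stated assumptions, not the paper's method: the bandwidth m = n^0.65, the two non-overlapping blocks, and the textbook equal-weight combination (which removes a leading bias term of order 1/n) are illustrative choices, whereas the paper derives weights that are optimal in the bias/variance sense described above. The function names lpr_estimate and jackknife_lpr are hypothetical.

```python
import numpy as np

def lpr_estimate(x, power=0.65):
    """Log-periodogram (GPH) regression estimate of d.

    Regresses log I(lambda_j) on -2*log(lambda_j) over the first
    m = floor(n**power) Fourier frequencies; the slope is d-hat.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(np.floor(n ** power))
    # Periodogram at Fourier frequencies lambda_j = 2*pi*j/n, j = 1..m
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = (np.abs(dft) ** 2) / (2.0 * np.pi * n)
    # Simple least-squares slope of log-periodogram on -2*log(lambda)
    y = np.log(I)
    z = -2.0 * np.log(lam)
    z_c = z - z.mean()
    return np.sum(z_c * (y - y.mean())) / np.sum(z_c ** 2)

def jackknife_lpr(x, n_blocks=2, power=0.65):
    """Equal-weight non-overlapping sub-sample jackknife of the LPR estimator.

    Uses the standard combination (M*d_full - mean of sub-sample estimates)/(M - 1),
    which cancels a leading O(1/n) bias term; this stands in for the paper's
    optimally weighted combination, which is not reproduced here.
    """
    n = len(x)
    d_full = lpr_estimate(x, power)
    block = n // n_blocks
    d_sub = np.array([lpr_estimate(x[i * block:(i + 1) * block], power)
                      for i in range(n_blocks)])
    M = n_blocks
    return (M * d_full - d_sub.mean()) / (M - 1)
```

As a usage note, calling jackknife_lpr on a simulated fractionally integrated series returns the bias-adjusted estimate of d alongside the raw estimate from lpr_estimate, so the two can be compared in a small Monte Carlo exercise of the kind described in the abstract.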
