Some Information Inequalities for Statistical Inference

02/13/2018
by Harsha K V, et al.

In this paper, we first describe the generalized notion of the Cramér-Rao lower bound obtained by Naudts (2004) using two families of probability density functions: the original model and an escort model. We reinterpret the results of Naudts (2004) from a statistical point of view and obtain some interesting examples in which this bound is attained. Further, we derive information inequalities that generalize the classical Bhattacharyya bounds in both regular and non-regular cases.
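As background for the abstract above, the classical Cramér-Rao inequality (the special case that the escort construction of Naudts (2004) generalizes) states that any unbiased estimator T of θ based on n i.i.d. observations satisfies Var(T) ≥ 1/(n·I(θ)), where I(θ) is the per-observation Fisher information. A minimal numerical sketch, assuming the standard Gaussian location model N(θ, σ²) (chosen here for illustration; it is one of the well-known cases where the bound is attained by the sample mean, with I(θ) = 1/σ²):

```python
# Sketch: Monte Carlo check that the sample mean of N(theta, sigma^2)
# attains the classical Cramer-Rao bound Var(T) >= 1 / (n * I(theta)),
# with per-observation Fisher information I(theta) = 1 / sigma^2.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 1.5, 50, 20000

# reps independent samples of size n; row means are unbiased estimates of theta
samples = rng.normal(theta, sigma, size=(reps, n))
estimates = samples.mean(axis=1)

fisher_info = 1.0 / sigma**2          # Fisher information of one observation
cr_bound = 1.0 / (n * fisher_info)    # equals sigma^2 / n

# empirical variance of the estimator vs. the Cramer-Rao lower bound
print(estimates.var(ddof=1), cr_bound)
```

The two printed numbers agree up to Monte Carlo error, illustrating attainment of the bound; for models outside the exponential family the inequality is typically strict, which is where generalized bounds of the Naudts and Bhattacharyya type become informative.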


