On cumulative residual (past) inaccuracy for truncated random variables
To overcome the drawbacks of Shannon's entropy, the concepts of cumulative residual entropy and cumulative past entropy have been proposed in the information-theoretic literature. Shannon entropy has also been generalized in a number of different ways; one important extension is the Kerridge inaccuracy measure. In the present communication we study the cumulative residual and cumulative past inaccuracy measures, which are extensions of the corresponding cumulative entropies. Several properties, including monotonicity and bounds, are obtained for left-, right- and doubly truncated random variables.
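For context, a sketch of the measures referred to above, using the standard forms found in the cumulative-entropy literature (the notation $F$, $G$ for distribution functions with densities $f$, $g$ and survival functions $\bar F$, $\bar G$ is assumed here, not quoted from the abstract): the Kerridge inaccuracy and its cumulative residual and cumulative past analogues are typically written as

\[
K(F,G) = -\int f(x)\,\log g(x)\,\mathrm{d}x,
\]
\[
\mathrm{CRI}(F,G) = -\int_0^{\infty} \bar F(x)\,\log \bar G(x)\,\mathrm{d}x,
\qquad
\mathrm{CPI}(F,G) = -\int_0^{\infty} F(x)\,\log G(x)\,\mathrm{d}x,
\]

which reduce to the cumulative residual and cumulative past entropies when $G = F$. The truncated versions studied in the paper condition these quantities on events of the form $\{X > t\}$, $\{X \le t\}$ or $\{t_1 < X \le t_2\}$.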