Information Measure Similarity Theory: Message Importance Measure via Shannon Entropy

01/04/2019
by Rui She, et al.

Rare events attract particular attention and interest in many big-data scenarios such as anomaly detection and security systems. To characterize the importance of rare events from a probabilistic perspective, the message importance measure (MIM) has been proposed as a semantic analysis tool. Similar to Shannon entropy, the MIM has its own functional role in information processing, in which the parameter ϖ of the MIM plays a vital part. In fact, the parameter ϖ dominates the properties of the MIM, giving it three work regions whose corresponding parameters satisfy 0 < ϖ < 2/p(x_i), ϖ > 2/p(x_i), and ϖ < 0, respectively. Furthermore, in the case 0 < ϖ < 2/p(x_i), there are similarities between the MIM and Shannon entropy in information compression and transmission, which provide a new viewpoint for information theory. This paper first constructs a system model with the message importance measure and proposes the message importance loss to enrich information processing strategies. Moreover, we propose the message importance loss capacity to measure the information importance harvested in a transmission. Furthermore, the message importance distortion function is presented to give an upper bound on information compression based on the message importance measure. Additionally, the bitrate of transmission constrained by the message importance loss is investigated to broaden the scope of Shannon information theory.
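As a rough illustration of the contrast drawn above, the following Python sketch compares Shannon entropy with the MIM for a skewed distribution. It assumes the definition L(p; ϖ) = log Σ_i p(x_i) exp(ϖ (1 − p(x_i))) used in the earlier MIM literature, which is not stated in this abstract; the example distribution and the probed ϖ values are purely hypothetical.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p, where=p > 0, out=np.zeros_like(p)))

def mim(p, varpi):
    """Message importance measure, assuming the form from the MIM literature:
    L(p; varpi) = log( sum_i p_i * exp(varpi * (1 - p_i)) ).
    For varpi > 0, low-probability (rare) events receive larger weight."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p * np.exp(varpi * (1.0 - p))))

# Hypothetical skewed distribution with one rare event.
p = np.array([0.7, 0.2, 0.09, 0.01])

print("Shannon entropy:", shannon_entropy(p))
# The abstract's three work regions are set by varpi relative to 2/p(x_i);
# here we simply probe a few values on either side of zero.
for varpi in (-1.0, 0.5, 1.0, 5.0):
    print(f"MIM (varpi = {varpi}):", mim(p, varpi))
```

The sketch only shows how the parameter ϖ shifts emphasis toward or away from rare events; the loss, capacity, and distortion quantities introduced in the paper are not reproduced here.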
