The CM Algorithm for the Maximum Mutual Information Classifications of Unseen Instances

01/28/2019
by Chenguang Lu et al.

The Maximum Mutual Information (MMI) criterion differs from the Least Error Rate (LER) criterion: it can reduce the failure to report small-probability events. This paper introduces the Channels Matching (CM) algorithm for the MMI classification of unseen instances, along with the semantic information methods on which the CM algorithm is based. In the CM algorithm, label learning lets the semantic channel match the Shannon channel (Matching I), whereas classifying lets the Shannon channel match the semantic channel (Matching II). We can achieve MMI classification by repeating Matching I and Matching II. For low-dimensional feature spaces, we use parameters only to construct n likelihood functions for the n different classes (rather than to construct partitioning boundaries, as gradient descent does) and express the boundaries by numerical values. Without searching in parameter spaces, the computation of the CM algorithm for low-dimensional feature spaces is very simple and fast. Using a two-dimensional example, we test the speed and reliability of the CM algorithm with different initial partitions. For most initial partitions, two iterations suffice to make the mutual information surpass 99%. The analysis indicates that for high-dimensional feature spaces, we may combine the CM algorithm with neural networks to improve MMI classification for faster and more reliable convergence.
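The alternating Matching I / Matching II scheme described above can be sketched in a toy one-dimensional setting. The sketch below assumes Gaussian likelihood functions and a synthetic two-class mixture; it illustrates the alternation between fitting likelihood functions to a partition and re-partitioning by maximum semantic information, not the paper's exact formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mixture of two classes; the true labels are unseen.
x = np.concatenate([rng.normal(-1.0, 1.0, 500), rng.normal(2.0, 1.0, 500)])

# Initial partition (an initial Shannon channel): a guessed boundary at 0.
labels = (x > 0.0).astype(int)

for _ in range(10):
    # Matching I: let the semantic channel match the Shannon channel --
    # fit a likelihood function (here an assumed Gaussian) to each class.
    mus = [x[labels == k].mean() for k in (0, 1)]
    sigmas = [x[labels == k].std() for k in (0, 1)]
    priors = [np.mean(labels == k) for k in (0, 1)]

    def lik(k):
        # Unnormalized Gaussian density; the constant factor cancels below.
        return np.exp(-0.5 * ((x - mus[k]) / sigmas[k]) ** 2) / sigmas[k]

    # Matching II: let the Shannon channel match the semantic channel --
    # give each instance the label whose likelihood function conveys the
    # most information about it, log(P(x|theta_k) / P(x)). Unlike a MAP
    # rule, this criterion does not penalize small-probability classes.
    px = sum(priors[k] * lik(k) for k in (0, 1))
    info = np.stack([np.log(lik(k) / px) for k in (0, 1)])
    new_labels = info.argmax(axis=0)
    if np.array_equal(new_labels, labels):
        break  # the partition is stable; the iteration has converged
    labels = new_labels
```

In this toy run the two fitted means separate toward the two mixture components after a few iterations, and the partition boundary is expressed purely as numerical label assignments rather than as a trained parametric boundary.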
