The Advantage of Cross Entropy over Entropy in Iterative Information Gathering

09/26/2014
by Johannes Kulick, et al.

Gathering the most information from the least amount of data is a common task in experimental design or when exploring an unknown environment in reinforcement learning and robotics. A widely used measure for quantifying the information contained in some distribution of interest is its entropy. Greedily minimizing the expected entropy is therefore a standard method for choosing samples in order to gain strong beliefs about the underlying random variables. We show that this approach is prone to temporarily getting stuck in local optima corresponding to wrongly biased beliefs. We suggest instead maximizing the expected cross entropy between the old and new belief, which aims at challenging refutable beliefs and thereby avoids these local optima. We show that both criteria are closely related and that their difference can be traced back to the asymmetry of the Kullback-Leibler divergence. In illustrative examples as well as simulated and real-world experiments, we demonstrate the advantage of cross entropy over simple entropy for practical applications.
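The two selection criteria contrasted in the abstract can be made concrete with a small numerical sketch. The Python example below is not the authors' code; it assumes a hypothetical 1D active-learning setup with a Bernoulli outcome y ~ sigmoid(x - theta), a discretized belief over the threshold theta, and hand-picked grid and prior parameters. For each candidate query x it computes the expected posterior entropy (minimized under the entropy criterion) and the expected cross entropy H(b_old, b_new) between the old and updated belief (maximized under the proposed criterion); since H(b_old, b_new) = H(b_old) + KL(b_old || b_new) and the expected entropy reduction equals the expected KL(b_new || b_old), the two criteria differ, as the abstract notes, in the direction of the Kullback-Leibler divergence.

import numpy as np

# Hypothetical setup (for illustration only): unknown threshold theta,
# outcome y = 1 with probability sigmoid(x - theta) for a chosen query x.
thetas = np.linspace(-3.0, 3.0, 301)                  # grid over theta
belief = np.exp(-0.5 * ((thetas - 1.5) / 0.5) ** 2)   # a (possibly wrongly biased) prior
belief /= belief.sum()

def likelihood(y, x):
    """p(y | x, theta) for every theta on the grid."""
    p1 = 1.0 / (1.0 + np.exp(-(x - thetas)))
    return p1 if y == 1 else 1.0 - p1

def entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum()

def cross_entropy(p, q):
    q = np.clip(q, 1e-12, 1.0)
    return -(p * np.log(q)).sum()

def expected_scores(x, belief):
    """Expected posterior entropy and expected cross entropy for query x."""
    exp_H, exp_CE = 0.0, 0.0
    for y in (0, 1):
        joint = likelihood(y, x) * belief
        p_y = joint.sum()                  # predictive probability of outcome y
        posterior = joint / p_y            # Bayesian belief update
        exp_H += p_y * entropy(posterior)
        exp_CE += p_y * cross_entropy(belief, posterior)
    return exp_H, exp_CE

candidates = np.linspace(-3.0, 3.0, 61)
scores = np.array([expected_scores(x, belief) for x in candidates])
x_entropy = candidates[np.argmin(scores[:, 0])]   # greedy expected-entropy criterion
x_cross = candidates[np.argmax(scores[:, 1])]     # expected cross-entropy criterion
print(f"entropy picks x = {x_entropy:.2f}, cross entropy picks x = {x_cross:.2f}")

With a prior that is biased away from the true threshold, the entropy criterion tends to query where the current belief already expects the most uncertainty reduction, while the cross-entropy criterion favors queries whose outcomes could most strongly contradict the current belief; the sketch only illustrates how the two scores are computed, not the paper's experimental results.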

