Understanding Boltzmann Machine and Deep Learning via A Confident Information First Principle

02/16/2013
by Xiaozhao Zhao, et al.

Typical dimensionality reduction methods focus on directly reducing the number of random variables while retaining maximal variation in the data. In this paper, we consider dimensionality reduction in the parameter spaces of binary multivariate distributions. We propose a general Confident-Information-First (CIF) principle that maximally preserves parameters with confident estimates and rules out unreliable or noisy parameters. Formally, the confidence of a parameter can be assessed by its Fisher information, which is connected to the inverse variance of any unbiased estimator of that parameter via the Cramér-Rao bound. We then revisit Boltzmann machines (BM) and theoretically show that both the single-layer BM without hidden units (SBM) and the restricted BM (RBM) can be derived from the CIF principle. This not only helps uncover and formalize the essential parts of the target density that SBM and RBM capture, but also suggests that a deep neural network consisting of several layers of RBMs can be seen as a layer-wise application of CIF. Guided by the theoretical analysis, we develop a sample-specific CIF-based contrastive divergence (CD-CIF) algorithm for SBM and a CIF-based iterative projection procedure (IP) for RBM. Both CD-CIF and IP are evaluated in a series of density estimation experiments.
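To make the confidence interpretation concrete, recall the standard scalar Cramér-Rao bound referenced in the abstract: for any unbiased estimator \(\hat{\theta}\) of a parameter \(\theta\),

\[
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = \mathbb{E}_{x \sim p(x;\theta)}\!\left[\left(\frac{\partial}{\partial\theta}\log p(x;\theta)\right)^{2}\right].
\]

A parameter with large Fisher information can in principle be estimated with low variance, which is the sense in which CIF treats it as confident and worth preserving; parameters with small Fisher information are the unreliable ones that CIF rules out.

The sketch below is a minimal illustration of this idea, not the paper's CD-CIF or IP algorithm. It assumes a fully visible binary model whose natural parameters have sufficient statistics x_i and x_i * x_j, uses the fact that the Fisher information of a natural parameter in an exponential family equals the variance of its sufficient statistic, and approximates that variance with data samples (a diagonal, empirical proxy). The function name cif_rank_parameters and the keep_ratio knob are illustrative choices, not from the paper.

```python
import numpy as np

def cif_rank_parameters(samples, keep_ratio=0.5):
    """Rank parameters of a fully visible binary model by an empirical
    Fisher-information proxy and keep the most confident fraction.

    samples: (n, d) array of binary vectors in {0, 1}.
    """
    d = samples.shape[1]
    stats = {}                                   # parameter name -> sufficient statistic per sample
    for i in range(d):
        stats[f"b_{i}"] = samples[:, i]          # bias terms, statistic x_i
    for i in range(d):
        for j in range(i + 1, d):
            stats[f"W_{i}{j}"] = samples[:, i] * samples[:, j]   # pairwise terms, statistic x_i * x_j
    # Diagonal empirical proxy for Fisher information: variance of each statistic.
    fisher = {name: float(s.var()) for name, s in stats.items()}
    ranked = sorted(fisher, key=fisher.get, reverse=True)
    kept = ranked[: int(len(ranked) * keep_ratio)]
    return kept, fisher

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = (rng.random((1000, 5)) < 0.3).astype(float)   # toy binary data
    kept, info = cif_rank_parameters(X)
    print("confident parameters:", kept)
```

In this toy setting, statistics that are nearly constant across the samples get low variance, hence low estimated Fisher information, and the corresponding parameters are the first to be dropped.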


