A Probabilistic Framework for Deep Learning

12/06/2016
by Ankit B. Patel, et al.

We develop a probabilistic framework for deep learning based on the Deep Rendering Mixture Model (DRMM), a new generative probabilistic model that explicitly captures variations in data due to latent task nuisance variables. We demonstrate that max-sum inference in the DRMM yields an algorithm that exactly reproduces the operations in deep convolutional neural networks (DCNs), providing a first-principles derivation. Our framework provides new insights into the successes and shortcomings of DCNs, as well as a principled route to their improvement. DRMM training via the Expectation-Maximization (EM) algorithm is a powerful alternative to DCN back-propagation, and initial training results are promising. Classification based on the DRMM and other variants outperforms DCNs in supervised digit classification, training 2-3x faster while achieving similar accuracy. Moreover, the DRMM is applicable to semi-supervised and unsupervised learning tasks, achieving results that are state-of-the-art in several categories on the MNIST benchmark and comparable to the state of the art on the CIFAR10 benchmark.
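To make the abstract's central claim concrete, here is a minimal NumPy sketch (not the authors' code) of how max-sum inference in a one-layer rendering mixture model collapses into familiar DCN operations: template matching at every position (convolution), a max over a binary on/off rendering switch (ReLU), and a max over the translation nuisance variable (max-pooling). The Gaussian rendering assumption, the single-layer restriction with translation as the only nuisance, and the use of `scipy.signal.correlate2d` are simplifying choices made here, not details taken from the paper.

```python
# Minimal sketch of max-sum inference in a one-layer rendering mixture model.
# Latent nuisances: a translation of the class template and a binary on/off
# rendering switch; noise is assumed isotropic Gaussian, so log-posterior
# scores reduce to (biased) inner products with translated templates.
import numpy as np
from scipy.signal import correlate2d


def max_sum_scores(image, templates, biases):
    """Return, for each class, the max over latent nuisances of its score."""
    scores = []
    for w, b in zip(templates, biases):
        # Template matching at every translation = convolution/correlation.
        resp = correlate2d(image, w, mode="valid") + b
        # Max over the binary on/off rendering switch -> ReLU.
        resp = np.maximum(resp, 0.0)
        # Max over the translation nuisance -> max-pooling over positions.
        scores.append(resp.max())
    return np.array(scores)


# Toy usage: classify a 28x28 image against three random 5x5 class templates.
rng = np.random.default_rng(0)
image = rng.standard_normal((28, 28))
templates = [rng.standard_normal((5, 5)) for _ in range(3)]
biases = [0.0, 0.0, 0.0]
predicted_class = int(np.argmax(max_sum_scores(image, templates, biases)))
```

In the full DRMM the same max-sum step is applied recursively, layer by layer, which is what yields the complete DCN forward pass described in the abstract.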

Related research

Semi-Supervised Learning with the Deep Rendering Mixture Model (12/06/2016)
A Probabilistic Theory of Deep Learning (04/02/2015)
Neural Rendering Model: Joint Generation and Prediction for Semi-Supervised Learning (11/01/2018)
GAN-EM: GAN based EM learning framework (12/02/2018)
Deep Expectation-Maximization for Semi-Supervised Lung Cancer Screening (10/02/2020)
Semi-supervised Contrastive Outlier removal for Pseudo Expectation Maximization (SCOPE) (06/28/2022)
Improvements to Supervised EM Learning of Shared Kernel Models by Feature Space Partitioning (05/31/2022)
