Supervised learning with probabilistic morphisms and kernel mean embeddings

05/10/2023 ∙ by Hông Vân Lê, et al.

In this paper I propose a concept of a correct loss function in a generative model of supervised learning for an input space 𝒳 and a label space 𝒴, both of which are measurable spaces. A correct loss function must accurately measure the discrepancy between elements of a hypothesis space ℋ of possible predictors and the supervisor operator, even when the supervisor operator does not belong to ℋ. To define correct loss functions, I propose a characterization of the regular conditional probability measure μ_{𝒴|𝒳} of a probability measure μ on 𝒳 × 𝒴 relative to the projection Π_𝒳 : 𝒳 × 𝒴 → 𝒳 as a solution of a linear operator equation. If 𝒴 is a separable metrizable topological space with its Borel σ-algebra ℬ(𝒴), I propose an additional characterization of μ_{𝒴|𝒳} as a minimizer of mean square error on the space of Markov kernels, referred to as probabilistic morphisms, from 𝒳 to 𝒴; this characterization utilizes kernel mean embeddings. Building upon these results, and employing inner measure to quantify the generalizability of a learning algorithm, I extend a result due to Cucker and Smale on the learnability of a regression model to the setting of a conditional probability estimation problem. Additionally, I present a variant of Vapnik's regularization method for solving stochastic ill-posed problems that incorporates inner measure, and I showcase its applications.
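
The mean-square-error characterization can be sketched as follows. This is only a plausible rendering under assumed hypotheses (a bounded measurable kernel k on 𝒴 with reproducing kernel Hilbert space ℋ_k and kernel mean embedding Φ_k); the precise statement and its conditions are those given in the paper:

    μ_{𝒴|𝒳} ∈ argmin_T ∫_{𝒳 × 𝒴} ‖ Φ_k(T(x)) − k(y, ·) ‖²_{ℋ_k} dμ(x, y),

where the minimum is taken over Markov kernels (probabilistic morphisms) T from 𝒳 to 𝒴, and Φ_k(T(x)) = ∫_𝒴 k(y′, ·) dT(x)(y′) is the kernel mean embedding of the probability measure T(x) on 𝒴, so that k(y, ·) = Φ_k(δ_y) embeds the Dirac measure at y.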

