Linear-time Learning on Distributions with Approximate Kernel Embeddings

09/24/2015
by Dougal J. Sutherland, et al.

Many interesting machine learning problems are best posed by considering instances that are distributions, or sample sets drawn from distributions. Previous work on machine learning tasks with distributional inputs has relied on pairwise kernel evaluations between pdfs (or sample sets). While such an approach is feasible for smaller datasets, computing an N × N Gram matrix is prohibitive for large ones. Recent scalable estimators that work over pdfs have been restricted to kernels based on Euclidean metrics, such as the L_2 distance. However, a myriad of other useful metrics are available, such as total variation, Hellinger distance, and the Jensen-Shannon divergence. This work develops the first random features for pdfs whose dot product approximates kernels based on these non-Euclidean metrics, allowing estimators that use such kernels to scale to large datasets by working in a primal space, without computing large Gram matrices. We analyze the approximation error of the proposed random features and empirically demonstrate the quality of the approximation, both in estimating a Gram matrix and in solving learning tasks on real-world and synthetic data.
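As a rough illustration of the primal-space idea, here is a minimal sketch for the Hellinger case only. It is not the paper's construction: it relies on the elementary fact that the squared Hellinger distance between two histograms equals half the squared Euclidean distance between their element-wise square roots, so a Gaussian kernel on the Hellinger distance reduces to an ordinary RBF kernel on square-rooted histograms, which admits standard random Fourier features (Rahimi and Recht). All names and parameters below (hellinger_rff_features, sigma, D) are illustrative, not from the paper.

import numpy as np

def hellinger_rff_features(P, sigma=1.0, D=512, seed=None):
    # Random features z(p) whose dot products approximate the kernel
    # k(p, q) = exp(-H^2(p, q) / (2 sigma^2)), H = Hellinger distance.
    # P: (n, d) array of histograms (rows nonnegative, summing to 1).
    rng = np.random.default_rng(seed)
    X = np.sqrt(P)  # Hellinger embedding: H^2(p, q) = 0.5 * ||sqrt(p) - sqrt(q)||^2
    d = X.shape[1]
    # Standard random Fourier features for a Gaussian kernel with bandwidth
    # sigma * sqrt(2); the sqrt(2) absorbs the 1/2 factor in H^2.
    W = rng.normal(scale=1.0 / (sigma * np.sqrt(2.0)), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Quick check: Z @ Z.T should be close to the exact Hellinger-Gaussian Gram matrix,
# so a linear model trained on Z runs in time linear in the number of distributions.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.dirichlet(np.ones(20), size=5)   # five random 20-bin histograms
    sigma = 0.5
    Z = hellinger_rff_features(P, sigma=sigma, D=4000, seed=1)
    H2 = 0.5 * ((np.sqrt(P)[:, None] - np.sqrt(P)[None, :]) ** 2).sum(-1)
    print(np.abs(Z @ Z.T - np.exp(-H2 / (2 * sigma ** 2))).max())

The same primal-space workflow, featurizing each distribution once and then applying a linear method, is what lets the estimators described in the abstract avoid forming the N × N Gram matrix.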


