Quantization Algorithms for Random Fourier Features

by Xiaoyun Li, et al.

The method of random projection (RP) is a standard technique in machine learning and many other areas, used for dimensionality reduction, approximate near neighbor search, compressed sensing, etc. Basically, RP provides a simple and effective scheme for approximating pairwise inner products and Euclidean distances in massive data. Closely related to RP, the method of random Fourier features (RFF) has also become popular for approximating the Gaussian kernel. RFF applies a specific nonlinear transformation to the projected data from random projections. In practice, using the (nonlinear) Gaussian kernel often leads to better performance than the linear kernel (inner product), partly due to the tuning parameter (γ) introduced in the Gaussian kernel. Recently, there has been a surge of interest in studying the properties of RFF. After random projections, quantization is an important step for efficient data storage, computation, and transmission. Quantization for RP has been extensively studied in the literature. In this paper, we focus on developing quantization algorithms for RFF. The task is challenging in part because of the tuning parameter γ in the Gaussian kernel: for example, the quantizer and the quantized data might be tied to each specific value of γ. Our contribution begins with an interesting discovery: the marginal distribution of RFF is actually free of the Gaussian kernel parameter γ. This finding significantly simplifies the design of the Lloyd-Max (LM) quantization scheme for RFF, in that only one LM quantizer is needed for RFF (regardless of γ). We also develop a variant named the LM^2-RFF quantizer, which in certain cases is more accurate. Experiments confirm that the proposed quantization schemes perform well.
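To make the setup concrete, below is a minimal sketch (not the authors' code) of the standard Rahimi-Recht RFF construction, whose feature inner products approximate the Gaussian kernel exp(-γ‖x−y‖²), together with a generic 1-D Lloyd-Max quantizer fitted to empirical samples. Note that each RFF coordinate is a cosine of a phase made uniform on [0, 2π) by the random offset b, which is the sense in which the per-coordinate marginal does not depend on γ. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def rff(X, D, gamma, seed=0):
    """Random Fourier features: Z(x) @ Z(y) approximates the
    Gaussian kernel exp(-gamma * ||x - y||^2) (Rahimi & Recht).
    """
    rng = np.random.default_rng(seed)
    # The spectral density of this Gaussian kernel is N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(X.shape[1], D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    # Because b is uniform on [0, 2*pi), the phase (x @ W + b) mod 2*pi
    # is uniform too, so each coordinate's marginal is free of gamma.
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def lloyd_max(samples, levels, iters=100):
    """Fit a 1-D Lloyd-Max quantizer to empirical samples by
    alternating nearest-codeword assignment and centroid updates
    (illustrative stand-in for the paper's LM quantizer design).
    """
    codebook = np.quantile(samples, (np.arange(levels) + 0.5) / levels)
    for _ in range(iters):
        idx = np.argmin(np.abs(samples[:, None] - codebook[None, :]), axis=1)
        for k in range(levels):
            mask = idx == k
            if mask.any():
                codebook[k] = samples[mask].mean()
    return np.sort(codebook)

# Kernel approximation check on two random points.
rng = np.random.default_rng(1)
X = rng.normal(size=(2, 20))
gamma = 0.2
Z = rff(X, 20000, gamma)
approx = float(Z[0] @ Z[1])
exact = float(np.exp(-gamma * np.sum((X[0] - X[1]) ** 2)))

# A gamma-free quantizer: fit Lloyd-Max to the RFF marginal,
# i.e. sqrt(2)*cos(U) with U uniform on [0, 2*pi).
marginal = np.sqrt(2.0) * np.cos(rng.uniform(0.0, 2.0 * np.pi, 50000))
codebook = lloyd_max(marginal, levels=4)
```

Because the marginal used to fit `codebook` involves no kernel parameter, the same codebook could in principle be reused for any γ, which is the simplification the abstract highlights.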




Sign Stable Random Projections for Large-Scale Learning

We study the use of "sign α-stable random projections" (where 0<α≤ 2) fo...

2-Bit Random Projections, NonLinear Estimators, and Approximate Near Neighbor Search

The method of random projections has become a standard tool for machine ...

Breaking the waves: asymmetric random periodic features for low-bitrate kernel machines

Many signal processing and machine learning applications are built from ...

Random matrices in service of ML footprint: ternary random features with no performance loss

In this article, we investigate the spectral behavior of random features...

Ternary and Binary Quantization for Improved Classification

Dimension reduction and data quantization are two important methods for ...

Linearized GMM Kernels and Normalized Random Fourier Features

The method of "random Fourier features (RFF)" has become a popular tool ...

Efficient KLMS and KRLS Algorithms: A Random Fourier Feature Perspective

We present a new framework for online Least Squares algorithms for nonli...
