No-Trick (Treat) Kernel Adaptive Filtering using Deterministic Features

12/10/2019
by Kan Li, et al.

Kernel methods form a powerful, versatile, and theoretically grounded unifying framework for solving nonlinear problems in signal processing and machine learning. The standard approach relies on the kernel trick to perform pairwise evaluations of a kernel function, which leads to scalability issues for large datasets due to the linear and superlinear growth of computation and memory with respect to the training-data size. A popular way to tackle this problem, known as random Fourier features (RFFs), samples from a distribution to obtain a data-independent basis for a higher, finite-dimensional feature space whose dot product approximates the kernel function. Recently, deterministic rather than random construction has been shown to outperform RFFs by approximating the kernel in the frequency domain using Gaussian quadrature. In this paper, we view the dot product of these explicit mappings not as an approximation, but as an equivalent positive-definite kernel that induces a new finite-dimensional reproducing kernel Hilbert space (RKHS). This opens the door to no-trick (NT) online kernel adaptive filtering (KAF) that is scalable and robust. Random features are prone to large variance in performance, especially for smaller feature dimensions. Here, we focus on deterministic feature-map construction based on polynomially exact quadrature rules and show its superiority over random constructions. Without loss of generality, we apply this approach to classical adaptive filtering algorithms and validate the methodology, showing that deterministic features are faster to generate and outperform state-of-the-art kernel methods based on random Fourier features.
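To make the contrast in the abstract concrete, the sketch below approximates the Gaussian kernel two ways: with random Fourier features (Monte Carlo samples from the kernel's spectral density) and with deterministic features built from a tensor-product Gauss-Hermite quadrature of the same spectral integral. This is a minimal sketch, not the authors' implementation: the function names (rff_features, gh_features), the bandwidth parameter sigma, and the tensor-product node layout are illustrative assumptions, and the paper's polynomially exact construction may organize the quadrature nodes differently.

import numpy as np
from itertools import product

def gaussian_kernel(x, y, sigma=1.0):
    # Exact Gaussian (RBF) kernel value for two vectors.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def rff_features(X, D, sigma=1.0, seed=None):
    # Random Fourier features: sample D frequencies from the kernel's
    # spectral density N(0, sigma^-2 I); the feature dot product is an
    # unbiased Monte Carlo estimate of the kernel value.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def gh_features(X, n_nodes, sigma=1.0):
    # Deterministic features: replace the Monte Carlo average with a
    # tensor-product Gauss-Hermite quadrature of the spectral integral.
    d = X.shape[1]
    t, w = np.polynomial.hermite.hermgauss(n_nodes)  # rule for weight exp(-t^2)
    w = w / np.sqrt(np.pi)                           # normalized weights sum to 1
    nodes = np.array(list(product(t, repeat=d))) * np.sqrt(2.0) / sigma
    weights = np.prod(np.array(list(product(w, repeat=d))), axis=1)
    s = np.sqrt(weights)
    proj = X @ nodes.T
    # phi(x) = [sqrt(a_j) cos(w_j.x), sqrt(a_j) sin(w_j.x)], so that
    # phi(x).phi(y) = sum_j a_j cos(w_j.(x - y)) ~= k(x, y).
    return np.hstack([s * np.cos(proj), s * np.sin(proj)])

if __name__ == "__main__":
    X = np.random.default_rng(0).normal(size=(2, 3))  # two points in R^3
    exact = gaussian_kernel(X[0], X[1])
    Zr = rff_features(X, D=686, sigma=1.0, seed=1)    # random, matched budget
    Zd = gh_features(X, n_nodes=7, sigma=1.0)         # 7^3 nodes -> 686 features
    print(exact, Zr[0] @ Zr[1], Zd[0] @ Zd[1])

With the feature budgets matched (686 features each), rerunning the script changes only the RFF estimate; the quadrature features are fixed, which illustrates the variance argument the abstract makes against random construction at small feature dimensions. Note also that tensor-product grids grow exponentially with the input dimension, which is why more economical polynomially exact rules matter in practice.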
