About the Non-Convex Optimization Problem Induced by Non-positive Semidefinite Kernel Learning

10/26/2020
by Katharina Morik, et al.

In recent years, kernel-based methods have proved very successful on many real-world learning problems. One of the main reasons for this success is their efficiency on large data sets, which results from the fact that kernel methods such as support vector machines (SVMs) are based on a convex optimization problem. Solving a new learning problem can then often be reduced to choosing an appropriate kernel function and kernel parameters. However, even the most powerful kernel methods can fail on quite simple data sets when the feature space induced by the chosen kernel function is not sufficient. In these cases, an explicit feature space transformation or the detection of latent variables has proved more successful. Since such explicit feature construction is often infeasible for large data sets, the ultimate goal of efficient kernel learning would be the adaptive creation of new, appropriate kernel functions. It cannot, however, be guaranteed that such a kernel function still leads to a convex optimization problem for support vector machines. Therefore, the optimization core of the learning method itself must be enhanced before it can be used with arbitrary, i.e., non-positive semidefinite, kernel functions. This article motivates the use of appropriate feature spaces and discusses the consequences, which lead to non-convex optimization problems. We show that these new non-convex SVMs are at least as accurate as their quadratic programming counterparts on eight real-world benchmark data sets in terms of generalization performance, and that they always outperform traditional approaches in terms of the original optimization problem. Additionally, the proposed algorithm is more generic than existing solutions, since it also works for non-positive semidefinite, i.e., indefinite, kernel functions.
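The obstacle the abstract names, a kernel that is not positive semidefinite, can be illustrated with the sigmoid (tanh) kernel, a well-known example of an indefinite kernel: its Gram matrix can have negative eigenvalues, so the standard convex SVM dual no longer applies. A minimal sketch, assuming NumPy; the data points are arbitrary illustrative values, not from the article:

```python
import numpy as np

def sigmoid_kernel(X, gamma=1.0, coef0=0.0):
    """Gram matrix K[i, j] = tanh(gamma * <x_i, x_j> + coef0).

    This kernel is not positive semidefinite in general, which is
    exactly the situation the article addresses.
    """
    return np.tanh(gamma * X @ X.T + coef0)

# Three one-dimensional points suffice to break positive semidefiniteness.
X = np.array([[1.0], [2.0], [3.0]])
K = sigmoid_kernel(X)

eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() < 0)  # True: the Gram matrix is indefinite
```

Because the Gram matrix has a negative eigenvalue, the quadratic term of the SVM dual is no longer concave, and a quadratic programming solver built for the convex case is not guaranteed to work, which is what motivates the non-convex optimization core discussed in the article.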


