Quantifying Learning Guarantees for Convex but Inconsistent Surrogates

10/26/2018
by Kirill Struminsky, et al.

We study the consistency properties of machine learning methods based on minimizing convex surrogates. We extend the recent framework of Osokin et al. (2017) for the quantitative analysis of consistency properties to the case of inconsistent surrogates. Our key technical contribution is a new lower bound on the calibration function for the quadratic surrogate, which is non-trivial (not always zero) even in inconsistent cases. The new bound allows us to quantify the level of inconsistency of a setting and shows how learning with inconsistent surrogates can still come with guarantees on sample complexity and optimization difficulty. We apply our theory to two concrete cases: multi-class classification with the tree-structured loss and ranking with the mean average precision loss. The results show the approximation-computation trade-offs caused by inconsistent surrogates and their potential benefits.
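To make the central object concrete, below is a minimal numeric sketch (not code from the paper) of the calibration function for the quadratic surrogate, in the setup of Osokin et al. (2017): a task loss given by a k x k matrix L, score vectors f decoded by argmax, and the surrogate Phi(f, y) = ||f + L[:, y]||^2 / (2k). The toy 0-1 loss matrix, the sampling scheme, and all constants are illustrative assumptions, not the losses analyzed in the paper.

```python
# Minimal sketch: estimate the calibration function H(eps) of the quadratic
# surrogate by random search. H(eps) is the infimum of the conditional excess
# surrogate risk over all (f, q) whose conditional excess task risk is >= eps.
# The loss matrix, sampling scheme, and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
k = 3
L = 1.0 - np.eye(k)  # toy 0-1 loss over k classes (an assumption)

def excess_task_risk(f, q):
    # Conditional task risk of predicting y_hat is (L @ q)[y_hat] = E_{y~q} L(y_hat, y);
    # the excess compares the argmax decoding of f against the Bayes prediction.
    cond = L @ q
    return cond[np.argmax(f)] - cond.min()

def excess_surrogate_risk(f, q):
    # For the quadratic surrogate the conditional minimizer is f* = -L @ q,
    # so the conditional excess has the closed form ||f + L @ q||^2 / (2k).
    return np.sum((f + L @ q) ** 2) / (2 * k)

# Random search over (f, q); the sampled minimum can only over-estimate
# the true infimum, so treat the output as a rough upper estimate of H(eps).
eps_grid = np.linspace(0.0, 0.8, 9)
H_hat = np.full_like(eps_grid, np.inf)
for _ in range(100_000):
    q = rng.dirichlet(np.ones(k))               # random conditional label distribution
    f = -L @ q + rng.normal(scale=0.5, size=k)  # scores near the surrogate minimizer
    d_task = excess_task_risk(f, q)
    d_surr = excess_surrogate_risk(f, q)
    H_hat = np.where(eps_grid <= d_task, np.minimum(H_hat, d_surr), H_hat)

for e, h in zip(eps_grid, H_hat):
    print(f"eps = {e:.1f}   H_hat(eps) = {h:.4f}")
```

For a consistent setting such as this toy 0-1 loss, the estimate stays strictly positive for every eps > 0. In the framework used here, inconsistency shows up as the exact calibration function vanishing on a whole interval of small eps; the paper's lower bound is non-trivial precisely because it quantifies that level of inconsistency and the guarantees that survive beyond it.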

Related research

- Consistent Polyhedral Surrogates for Top-k Classification and Variants (07/18/2022): Top-k classification is a generalization of multiclass classification us...
- A General Theory for Structured Prediction with Smooth Convex Surrogates (02/05/2019): In this work we provide a theoretical framework for structured predictio...
- On the Randomized Complexity of Minimizing a Convex Quadratic Function (07/24/2018): Minimizing a convex, quadratic objective is a fundamental problem in mac...
- On Structured Prediction Theory with Calibrated Convex Surrogate Losses (03/07/2017): We provide novel theoretical insights on structured prediction in the co...
- An Embedding Framework for the Design and Analysis of Consistent Polyhedral Surrogates (06/29/2022): We formalize and study the natural approach of designing convex surrogat...
- Rethinking and Reweighting the Univariate Losses for Multi-Label Ranking: Consistency and Generalization (05/10/2021): (Partial) ranking loss is a commonly used evaluation measure for multi-l...
- Paraconsistency and Word Puzzles (08/03/2016): Word puzzles and the problem of their representations in logic languages...
