A Direct Sum Result for the Information Complexity of Learning

04/16/2018
by Ido Nachum, et al.

How many bits of information are required to PAC learn a class of hypotheses of VC dimension d? The mathematical setting we follow is that of Bassily et al. (2018), where the quantity of interest is the mutual information I(S;A(S)) between the input sample S and the hypothesis output by the learning algorithm A. We introduce a class of functions of VC dimension d over the domain X with information complexity at least Ω(d log log(|X|/d)) bits for any consistent and proper algorithm (deterministic or randomized). Bassily et al. proved a similar (but quantitatively weaker) result for the case d=1. This result is in fact a special case of a more general phenomenon we explore. We define the notion of the information complexity of a given class of functions H: intuitively, it is the minimum amount of information that an algorithm for H must retain about its input in order to be consistent and proper. We prove a direct sum result for information complexity in this setting; roughly speaking, information complexity is additive when several classes are combined.
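To make the quantity I(S;A(S)) concrete, here is a toy sketch (a hypothetical illustration, not taken from the paper): for a deterministic learner A, I(S;A(S)) equals the entropy H(A(S)), which can be computed exactly for the class of thresholds over a small domain. The domain size, sample size, and the specific "smallest consistent threshold" learner below are all assumptions made for illustration.

```python
import itertools
import math
from collections import Counter

X = list(range(8))        # small finite domain
TARGETS = list(range(9))  # threshold t defines h_t(x) = 1 iff x >= t
m = 3                     # sample size

def learner(sample):
    # A proper, consistent learner: output the smallest threshold
    # consistent with the labeled sample.
    pos = [x for x, y in sample if y == 1]
    return min(pos) if pos else 8

# Joint distribution: target threshold uniform over TARGETS,
# sample points drawn i.i.d. uniformly from X and labeled by the target.
out_dist = Counter()
total = len(TARGETS) * len(X) ** m
for t in TARGETS:
    for xs in itertools.product(X, repeat=m):
        sample = tuple((x, int(x >= t)) for x in xs)
        out_dist[learner(sample)] += 1

# Since the learner is deterministic, I(S;A(S)) = H(A(S)).
H = -sum((c / total) * math.log2(c / total) for c in out_dist.values())
print(f"I(S;A(S)) = H(A(S)) = {H:.3f} bits")
```

The entropy here is at most log2(9) ≈ 3.17 bits, since the learner has only nine possible outputs; the paper's lower bounds concern how large this quantity must be for *every* consistent and proper learner as the domain grows.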


