On the Sample Complexity of HGR Maximal Correlation Functions

06/30/2019
by Shao-Lun Huang, et al.

The Hirschfeld-Gebelein-Rényi (HGR) maximal correlation and the corresponding maximal correlation functions have proven useful in many machine learning scenarios. In this paper, we study the sample complexity of estimating the HGR maximal correlation functions with the alternating conditional expectation (ACE) algorithm from a sequence of training data in the asymptotic regime. Specifically, we develop a mathematical framework that characterizes the learning error between the maximal correlation functions computed from the true distribution and the functions estimated by the ACE algorithm. For both supervised and semi-supervised learning scenarios, we derive analytical expressions for the error exponents of the learning error, which indicate the number of training samples required to estimate the HGR maximal correlation functions by the ACE algorithm. Moreover, building on these theoretical results, we investigate the sampling strategy for different types of samples in semi-supervised learning under a total sampling budget constraint, and develop an optimal sampling strategy that maximizes the error exponent of the learning error. Finally, numerical simulations are presented to support our theoretical results.
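For intuition, the ACE algorithm can be viewed as a power iteration that alternates between the two conditional expectation maps. The following is a minimal sketch for discrete, integer-coded samples, using the empirical joint distribution; the function name and the details (iteration count, normalization under the empirical marginals) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def ace_maximal_correlation(x, y, n_iter=200, seed=0):
    """Estimate the HGR maximal correlation of two discrete samples
    x, y (integer-coded NumPy arrays) via the alternating conditional
    expectation (ACE) iteration on the empirical joint distribution.
    A hedged sketch, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    kx, ky = x.max() + 1, y.max() + 1
    # Empirical joint distribution P(x, y) from the samples.
    P = np.zeros((kx, ky))
    np.add.at(P, (x, y), 1.0)
    P /= P.sum()
    px, py = P.sum(axis=1), P.sum(axis=0)  # empirical marginals
    g = rng.standard_normal(ky)            # random initialization
    for _ in range(n_iter):
        # f(x) <- E[g(Y) | X = x], then center and normalize under P_X.
        f = (P @ g) / px
        f -= px @ f
        f /= np.sqrt(px @ f**2)
        # g(y) <- E[f(X) | Y = y], then center and normalize under P_Y.
        g = (P.T @ f) / py
        g -= py @ g
        g /= np.sqrt(py @ g**2)
    # Maximal correlation estimate: rho = E[f(X) g(Y)].
    rho = f @ P @ g
    return rho, f, g
```

On a pair of deterministically related samples the estimate approaches 1, while for nearly independent samples it concentrates near 0 as the sample size grows, which is the regime the error-exponent analysis quantifies.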
