Hyperlink Regression via Bregman Divergence

07/22/2019
by Akifumi Okuno, et al.

A collection of U (∈N) data vectors is called a U-tuple, and the association strength among the vectors of a tuple is termed the hyperlink weight, which is assumed to be symmetric with respect to permutation of the entries in the index tuple. We herein propose Bregman hyperlink regression (BHLR), which learns a user-specified symmetric similarity function that predicts a tuple's hyperlink weight from the data vectors in the U-tuple. Nonlinear functions, such as neural networks, can be employed as the similarity function. BHLR is based on the Bregman divergence (BD) and encompasses various existing methods such as logistic regression (U=1), Poisson regression (U=1), graph embedding (U=2), matrix factorization (U=2), tensor factorization (U ≥ 2), and their variants equipped with arbitrary BDs. We demonstrate that, regardless of the choice of BD and U ∈N, the proposed BHLR is generally (P-1) robust against distributional misspecification, that is, it asymptotically recovers the underlying true conditional expectation of hyperlink weights given data vectors regardless of the underlying conditional distribution, and (P-2) computationally tractable, that is, it can be computed efficiently by stochastic optimization algorithms using a novel generalized minibatch sampling procedure for hyper-relational data. Furthermore, a theoretical guarantee for the optimization is presented. Numerical experiments demonstrate the promising performance of the proposed BHLR.
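To make the setup concrete, the following Python sketch illustrates one special case of BHLR: U = 2 pairwise links, the squared Euclidean Bregman divergence (the least-squares instance), and a symmetric inner product of linear embeddings as the similarity function. The names, dimensions, and uniform pair-sampling scheme are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of BHLR training (hypothetical, not the authors' code).
# Assumptions: U = 2 (pairwise hyperlinks), the Bregman divergence is the
# squared Euclidean distance, and the similarity is a symmetric inner
# product of linear embeddings f(x) = x A.
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 100, 16, 8                  # number of vectors, input dim, embedding dim
X = rng.normal(size=(n, d))           # data vectors x_1, ..., x_n
W = rng.normal(size=(n, n))           # observed hyperlink weights w_{ij}
W = (W + W.T) / 2                     # enforce symmetry of the weights

A = 0.01 * rng.normal(size=(d, k))    # parameters of the embedding map


def similarity(xi, xj, A):
    """Symmetric similarity mu(x_i, x_j) = <f(x_i), f(x_j)>."""
    return (xi @ A) @ (xj @ A)


def sgd_step(A, batch, lr=1e-3):
    """One SGD step on the squared-loss Bregman objective over a minibatch of pairs."""
    g = np.zeros_like(A)
    for i, j in batch:
        resid = similarity(X[i], X[j], A) - W[i, j]   # derivative of (mu - w)^2 / 2 w.r.t. mu
        fi, fj = X[i] @ A, X[j] @ A
        g += resid * (np.outer(X[i], fj) + np.outer(X[j], fi))
    return A - lr * g / len(batch)


# Stochastic optimization with minibatches of uniformly sampled index pairs.
for step in range(2000):
    batch = [tuple(rng.integers(0, n, size=2)) for _ in range(32)]
    A = sgd_step(A, batch)
```

Other instances in the BHLR family correspond to swapping the squared loss for a different Bregman divergence (e.g., the KL divergence for Poisson-type weights) and replacing the linear embedding with a neural network.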
