Random matrices in service of ML footprint: ternary random features with no performance loss

10/05/2021 βˆ™ by Hafiz Tiomoko Ali, et al.

In this article, we investigate the spectral behavior of random features kernel matrices of the type K = 𝔼_w[Οƒ(w^T x_i)Οƒ(w^T x_j)]_{i,j=1}^n, with nonlinear function Οƒ(Β·), data x_1, …, x_n βˆˆβ„^p, and random projection vector wβˆˆβ„^p having i.i.d. entries. In a high-dimensional setting where the number of data points n and their dimension p are both large and comparable, we show, under a Gaussian mixture model for the data, that the eigenspectrum of K is independent of the distribution of the i.i.d. (zero-mean and unit-variance) entries of w, and depends on Οƒ(Β·) only via its (generalized) Gaussian moments 𝔼_zβˆΌπ’©(0,1)[Οƒ'(z)] and 𝔼_zβˆΌπ’©(0,1)[Οƒ''(z)]. As a result, for any kernel matrix K of the form above, we propose a novel random features technique, called Ternary Random Features (TRF), that (i) asymptotically yields the same limiting kernel as the original K in a spectral sense and (ii) can be computed and stored much more efficiently, by carefully tuning (in a data-dependent manner) the function Οƒ and the random vector w, both taking values in {-1, 0, 1}. The computation of the proposed random features requires no multiplication, and storage takes b times fewer bits than classical random features such as random Fourier features, with b the number of bits needed to store full-precision values. Moreover, our experiments on real data show that these substantial gains in computation and storage come with somewhat improved performance compared to state-of-the-art random features compression/quantization methods.
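To make the idea concrete, the following is a minimal, hypothetical sketch of ternary random features: both the projection matrix and the nonlinearity take values in {-1, 0, 1}, so the projection reduces to signed sums. The sparsity level and activation threshold used here are illustrative placeholders, not the data-dependent tuning proposed in the paper.

```python
import numpy as np

def ternary_random_features(X, n_features, w_sparsity=1/3, threshold=0.5, seed=0):
    """Map data X of shape (n, p) to ternary features in {-1, 0, 1}.

    Both the random projection vectors w and the nonlinearity sigma take
    values in {-1, 0, 1}: with a ternary W, the product X @ W amounts to
    signed sums of entries of X, i.e. no multiplications are needed.
    Sparsity and threshold here are illustrative, not the paper's tuning.
    """
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    # Ternary random projection matrix W in {-1, 0, 1}^(p x n_features).
    W = rng.choice([-1, 0, 1], size=(p, n_features),
                   p=[w_sparsity / 2, 1 - w_sparsity, w_sparsity / 2])
    Z = X @ W
    # Ternary "activation": 0 inside [-threshold, threshold], sign outside.
    return np.where(np.abs(Z) <= threshold, 0, np.sign(Z)).astype(np.int8)

# Usage: approximate a kernel matrix from ternary features.
X = np.random.default_rng(1).standard_normal((100, 20))
Phi = ternary_random_features(X, n_features=64)
K_approx = (Phi @ Phi.T) / 64  # random-features kernel estimate
```

Since each feature fits in 2 bits (three possible values), storage drops by roughly a factor of b relative to b-bit full-precision features.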
