Subspace Embedding and Linear Regression with Orlicz Norm

06/17/2018
by Alexandr Andoni, et al.

We consider a generalization of the classic linear regression problem to the case when the loss is an Orlicz norm. An Orlicz norm is parameterized by a non-negative convex function G:R_+→R_+ with G(0)=0: the Orlicz norm of a vector x∈R^n is defined as ‖x‖_G = inf{α>0 | ∑_{i=1}^n G(|x_i|/α) ≤ 1}. We consider the cases where the function G(·) grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given matrix A∈R^{n×d} with Orlicz norm into a lower-dimensional space with the ℓ_2 norm. Specifically, we show how to efficiently find an embedding matrix S∈R^{m×n}, m<n, such that ∀x∈R^d, Ω(1/(d log n))·‖Ax‖_G ≤ ‖SAx‖_2 ≤ O(d^2 log n)·‖Ax‖_G. By applying this subspace embedding technique, we give an approximation algorithm for the regression problem min_{x∈R^d} ‖Ax−b‖_G, up to an O(d^2 log n) factor. As a further application of our techniques, we show how to use them to improve on algorithms for the ℓ_p low-rank matrix approximation problem for 1≤p<2.
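Since the Orlicz norm is defined through an infimum, one way to make the definition concrete is to compute it numerically: a convex G with G(0)=0 and G≥0 is non-decreasing on R_+, so ∑_i G(|x_i|/α) is non-increasing in α and the infimum can be found by bisection. The following is a minimal illustrative sketch, not code from the paper (the function orlicz_norm and the choices of G below are our own); as sanity checks, G(t)=t and G(t)=t^2 recover the ℓ_1 and ℓ_2 norms, respectively.

```python
import numpy as np

def orlicz_norm(x, G, iters=100):
    """||x||_G = inf{a > 0 : sum_i G(|x_i|/a) <= 1}, found by bisection.

    For convex G with G(0) = 0, the map a -> sum_i G(|x_i|/a) is
    non-increasing, so feasibility of a is monotone and bisection applies.
    Illustrative sketch only, not the paper's algorithm.
    """
    x = np.abs(np.asarray(x, dtype=float))
    if not x.any():
        return 0.0
    hi = 1.0
    while np.sum(G(x / hi)) > 1.0:  # grow hi until alpha = hi is feasible
        hi *= 2.0
    lo = 0.0                        # alpha -> 0 is infeasible for x != 0
    for _ in range(iters):          # bisect on the feasibility threshold
        mid = (lo + hi) / 2.0
        if np.sum(G(x / mid)) <= 1.0:
            hi = mid
        else:
            lo = mid
    return hi

G_l1 = lambda t: t        # Orlicz norm with G(t) = t   is the l1 norm
G_l2 = lambda t: t ** 2   # Orlicz norm with G(t) = t^2 is the l2 norm

x = np.array([3.0, 4.0])
print(orlicz_norm(x, G_l1))  # ~7.0, matches np.sum(np.abs(x))
print(orlicz_norm(x, G_l2))  # ~5.0, matches np.linalg.norm(x)
```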

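At a high level, the regression application follows the sketch-and-solve paradigm: apply the embedding S to the problem and solve the resulting small ℓ_2 least-squares instance. The abstract does not describe the paper's construction of S, so the sketch below substitutes a dense Gaussian sketch (an oblivious ℓ_2 subspace embedding, not the paper's Orlicz embedding) purely to illustrate the pipeline; the function sketch_and_solve and its parameters are our own illustrative assumptions.

```python
import numpy as np

def sketch_and_solve(A, b, m, seed=None):
    """Illustrative sketch-and-solve: draw an oblivious sketch S (here a
    Gaussian matrix, a stand-in for the paper's Orlicz embedding), then
    minimize ||S A x - S b||_2 in place of the original objective."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)  # oblivious: drawn independently of A, b
    x_hat, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x_hat

# Toy usage: n >> d, sketch down to m = O(d) rows before solving.
rng = np.random.default_rng(0)
n, d = 10_000, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)
x_hat = sketch_and_solve(A, b, m=50, seed=1)
print(np.linalg.norm(x_hat - x_true))  # small: the sketched l2 solve is near-optimal
```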