Nonconvex Low-Rank Symmetric Tensor Completion from Noisy Data

11/11/2019
by Changxiao Cai, et al.

We study a noisy symmetric tensor completion problem of broad practical interest: the reconstruction of a low-rank symmetric tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, existing algorithms are either computationally too expensive for large-scale applications or come with sub-optimal statistical guarantees. Focusing on "incoherent" and well-conditioned tensors of constant CP rank, we propose a two-stage nonconvex algorithm, (vanilla) gradient descent following a rough initialization, that achieves the best of both worlds. Specifically, the proposed algorithm faithfully completes the tensor and retrieves all individual tensor factors within nearly linear time, while simultaneously enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal estimation accuracy). The estimation errors are evenly spread across all entries, thus achieving optimal ℓ_∞ statistical accuracy. The insight conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
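To make the two-stage template concrete, here is a minimal NumPy sketch for an order-3 symmetric tensor: a rough spectral initialization from the rescaled mode-1 unfolding, followed by vanilla gradient descent on the sampled squared loss. All details (the specific unfolding, the per-factor scaling, the step size, and the iteration count) are illustrative assumptions, not the paper's exact procedure, and the gradient formula assumes a symmetric sampling pattern.

```python
import numpy as np

def cp_symmetric(U):
    """Symmetric CP tensor: T = sum_s u_s (x) u_s (x) u_s, u_s = columns of U."""
    return np.einsum('is,js,ks->ijk', U, U, U)

def complete_symmetric_tensor(T_obs, mask, rank, p, step=0.05, iters=300):
    """Two-stage sketch: spectral initialization + vanilla gradient descent.

    T_obs: observed entries (zeros elsewhere); mask: 0/1 symmetric sampling
    pattern; p: sampling probability. `step` and `iters` are illustrative.
    """
    n = T_obs.shape[0]

    # Stage 1 (rough initialization): SVD of the rescaled mode-1 unfolding.
    M = (T_obs * mask / p).reshape(n, n * n)
    Uinit, svals, _ = np.linalg.svd(M, full_matrices=False)
    U = Uinit[:, :rank] * np.cbrt(svals[:rank])  # crude per-factor scaling

    # Resolve the sign ambiguity of singular vectors: for an odd-order
    # tensor, u and -u produce different rank-1 components.
    for s in range(rank):
        if np.einsum('ijk,i,j,k->', T_obs, U[:, s], U[:, s], U[:, s]) < 0:
            U[:, s] = -U[:, s]

    # Stage 2: vanilla gradient descent on the sampled squared loss
    #   f(U) = (1/2p) * || P_Omega( cp_symmetric(U) - T ) ||_F^2
    for _ in range(iters):
        R = (cp_symmetric(U) - T_obs) * mask / p      # rescaled residual
        # With a symmetric residual, the three partial derivatives of the
        # CP map coincide, giving a factor of 3 in the gradient.
        G = 3.0 * np.einsum('ijk,js,ks->is', R, U, U)
        U = U - step * G
    return U
```

Forming the full `n x n x n` tensor each iteration is cubic in `n`; the nearly linear runtime claimed in the abstract relies instead on touching only the observed entries, which this dense sketch does not attempt.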
