Fast Projected Newton-like Method for Precision Matrix Estimation with Nonnegative Partial Correlations

by Jiaxi Ying, et al.

We study the problem of estimating precision matrices in multivariate Gaussian distributions where all partial correlations are nonnegative, also known as multivariate totally positive of order two (MTP_2). Such models have received significant attention in recent years, primarily due to interesting properties, e.g., the maximum likelihood estimator exists with as few as two observations regardless of the underlying dimension. We formulate this problem as a weighted ℓ_1-norm regularized Gaussian maximum likelihood estimation under MTP_2 constraints. In this direction, we propose a novel projected Newton-like algorithm that incorporates a well-designed approximate Newton direction, which gives our algorithm the same order of computational and memory cost as first-order methods. We prove that the proposed projected Newton-like algorithm converges to the minimizer of the problem. We further show, both theoretically and experimentally, that the minimizer of our weighted ℓ_1-norm formulation recovers the support of the underlying precision matrix correctly without requiring the incoherence condition present in ℓ_1-norm based methods. Experiments on synthetic and real-world data demonstrate that our proposed algorithm is significantly more efficient, in terms of computational time, than state-of-the-art methods. Finally, we apply our method to financial time-series data, which are well known for displaying positive dependencies, and observe significant improvements in the modularity of the learned financial networks.
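To make the formulation concrete: MTP_2 for a Gaussian is equivalent to requiring the precision matrix Θ to be an M-matrix (nonpositive off-diagonal entries), and on that feasible set the weighted ℓ_1 penalty on the off-diagonals becomes smooth, since |Θ_ij| = -Θ_ij. The sketch below is a simplified first-order (projected gradient) variant of the constrained MLE, not the authors' projected Newton-like method; the step size, initialization, and stopping rule are illustrative assumptions.

```python
import numpy as np

def mtp2_precision_pgd(S, lam=0.1, step=0.01, n_iter=500):
    """Projected-gradient sketch for
        min_Theta  -log det(Theta) + tr(S @ Theta) + lam * sum_{i!=j} |Theta_ij|
        s.t.       Theta_ij <= 0 for all i != j   (M-matrix / MTP_2 constraint).

    On the feasible set |Theta_ij| = -Theta_ij off the diagonal, so the
    penalty contributes a constant -lam to each off-diagonal gradient entry.
    """
    p = S.shape[0]
    # Diagonal initialization (always a feasible, positive definite M-matrix).
    Theta = np.linalg.inv(np.diag(np.diag(S)) + 1e-6 * np.eye(p))
    off = ~np.eye(p, dtype=bool)
    for _ in range(n_iter):
        # Gradient of the smooth objective restricted to the feasible set.
        grad = S - np.linalg.inv(Theta)
        grad[off] -= lam
        Theta_new = Theta - step * grad
        # Project onto the M-matrix cone: clip off-diagonals at zero.
        Theta_new[off] = np.minimum(Theta_new[off], 0.0)
        Theta_new = (Theta_new + Theta_new.T) / 2.0
        # Accept the step only if it keeps Theta positive definite.
        if np.min(np.linalg.eigvalsh(Theta_new)) > 1e-8:
            Theta = Theta_new
    return Theta
```

The projection here is a simple elementwise clipping, which is what keeps the per-iteration cost at the first-order level; the paper's contribution is obtaining Newton-like directions while preserving that cost.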



