Sum-of-squares chordal decomposition of polynomial matrix inequalities

07/22/2020
by Yang Zheng et al.

We prove three decomposition results for sparse positive (semi-)definite polynomial matrices. First, we show that a polynomial matrix P(x) with chordal sparsity is positive semidefinite for all x∈ℝ^n if and only if there exists a sum-of-squares (SOS) polynomial σ(x) such that σ(x)P(x) can be decomposed into a sum of sparse SOS matrices, each of which is zero outside a small principal submatrix. Second, we establish that setting σ(x)=(x_1^2 + ⋯ + x_n^2)^ν for some integer ν suffices if P(x) is even, homogeneous, and positive definite. Third, we prove a sparse-matrix version of Putinar's Positivstellensatz: if P(x) has chordal sparsity and is positive definite on a compact semialgebraic set 𝒦={x:g_1(x)≥ 0,…,g_m(x)≥ 0} satisfying the Archimedean condition, then P(x) = S_0(x) + g_1(x)S_1(x) + ⋯ + g_m(x)S_m(x) for matrices S_i(x) that are sums of sparse SOS matrices, each of which is zero outside a small principal submatrix. Using these decomposition results, we obtain sparse SOS representation theorems for polynomials that are quadratic and correlatively sparse in a subset of variables. We also obtain new convergent hierarchies of sparsity-exploiting SOS reformulations for convex optimization problems with large and sparse polynomial matrix inequalities. Analytical examples illustrate all our decomposition results, while large-scale numerical examples demonstrate that the corresponding sparsity-exploiting SOS hierarchies have significantly lower computational complexity than traditional ones.
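The first decomposition result can be sketched more explicitly. Assuming the standard clique-based notation for chordal sparsity (the symbols E_{C_k} and the clique count t below are illustrative, not taken verbatim from the abstract), the statement takes the form:

```latex
% Sketch of the first decomposition theorem (notation assumed):
% C_1, ..., C_t are the maximal cliques of the chordal sparsity
% graph of P(x), and E_{C_k} is the 0/1 matrix that selects the
% rows of a vector indexed by the clique C_k.
\sigma(x)\, P(x) \;=\; \sum_{k=1}^{t} E_{C_k}^{\top}\, S_k(x)\, E_{C_k},
% where each S_k(x) is an SOS matrix of size |C_k| x |C_k|, so each
% summand E_{C_k}^T S_k(x) E_{C_k} is zero outside the principal
% submatrix indexed by C_k.
```

Each term in the sum is a sparse SOS matrix supported on a single small principal submatrix, which is what makes the resulting SOS programs much cheaper to solve than a dense SOS certificate for P(x) itself.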
