Gaussian Signalling for Covert Communications
In this work, we examine the optimality of Gaussian signalling for covert communications under an upper bound on either D(p_1(y)||p_0(y)) or D(p_0(y)||p_1(y)) as the covert communication constraint, where the two divergences differ due to the asymmetry of the Kullback-Leibler divergence, and p_0(y) and p_1(y) are the likelihood functions of the warden's observation y under the null hypothesis (no covert transmission) and the alternative hypothesis (a covert transmission occurs), respectively. Considering additive white Gaussian noise at both the receiver and the warden, we prove that Gaussian signalling is optimal, in the sense of maximizing the mutual information between the transmitted and received signals, for covert communications subject to the constraint D(p_1(y)||p_0(y)) ≤ 2ϵ². More interestingly, we also prove that Gaussian signalling is not optimal under an upper bound on D(p_0(y)||p_1(y)): we explicitly show that a skew-normal signalling scheme can outperform Gaussian signalling by achieving higher mutual information. Finally, we prove that, for Gaussian signalling, an upper bound on D(p_1(y)||p_0(y)) is the tighter constraint, leading to lower mutual information than the same upper bound on D(p_0(y)||p_1(y)), by proving that D(p_0(y)||p_1(y)) ≤ D(p_1(y)||p_0(y)).
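As a numerical sanity check (not part of the paper itself), note that for Gaussian signalling over AWGN the warden's observation y is zero-mean Gaussian with variance σ² under the null hypothesis and variance σ² + P under the alternative hypothesis, where P denotes the covert transmit power; both divergences then have closed forms. The sketch below, with assumed values of σ² and P, confirms the ordering D(p_0(y)||p_1(y)) ≤ D(p_1(y)||p_0(y)) stated in the final result:

```python
import math

def kl_gauss_zero_mean(a, b):
    # Closed-form KL divergence D(N(0, a) || N(0, b))
    # between two zero-mean Gaussians with variances a and b.
    return 0.5 * (a / b - 1.0 + math.log(b / a))

sigma2 = 1.0  # assumed noise variance at the warden
for P in [0.01, 0.1, 1.0, 10.0]:  # assumed covert transmit powers
    d10 = kl_gauss_zero_mean(sigma2 + P, sigma2)  # D(p_1 || p_0)
    d01 = kl_gauss_zero_mean(sigma2, sigma2 + P)  # D(p_0 || p_1)
    # The ordering proved in the paper for Gaussian signalling:
    assert d01 <= d10
    print(f"P={P}: D(p1||p0)={d10:.4f}, D(p0||p1)={d01:.4f}")
```

In this Gaussian special case the ordering reduces to the elementary inequality 2·ln(1+t) ≤ t + t/(1+t) with t = P/σ², which holds for all t ≥ 0.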