Distributed SMC-PHD Fusion for Partial, Arithmetic Average Consensus

by Tiancheng Li, et al.

We propose an average consensus approach to distributed SMC-PHD (sequential Monte Carlo probability hypothesis density) fusion, in which local filters extract Gaussian mixtures (GMs) from their respective particle posteriors, share them iteratively with their neighbors, and finally use the disseminated GM to update the particle weights. Two features distinguish our approach from existing approaches. First, a computationally efficient particles-to-GM (P2GM) conversion scheme is developed that exploits the unique structure of the SMC-PHD updater, in which the particle weight can be exactly decomposed with respect to the measurements and misdetection. Only significant components of higher weight are used for parameterization. The resulting consensus, conditioned on partial information dissemination over the network, is called "partial consensus". Second, importance sampling (IS) is employed to re-weight the local particles when integrating the received GM information, while the particle states remain unchanged. As a result, the local prior-PHD and likelihood computations can be carried out in parallel with the dissemination-and-fusion procedure. To assess the effectiveness of the proposed P2GM parameterization and IS schemes, two related yet new distributed SMC-PHD fusion protocols are introduced for comparison. The first uses the same P2GM conversion and GM dissemination schemes as our approach, but regenerates the local particles from the disseminated GMs at each filtering iteration instead of applying IS; it performs similarly to our IS approach, as expected, but precludes the parallelization noted above. The second disseminates the particles themselves between neighbors instead of performing P2GM conversion; this avoids parameterization but is communicatively costly. The state-of-the-art exponential mixture density approach is also implemented for comparison.
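The two key steps of the abstract can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the 1-D state, the synthetic per-measurement weight decomposition, the significance threshold, and the stand-in "fused" GM are all assumptions for illustration. It shows (a) a P2GM-style conversion that fits one Gaussian component per weight column and keeps only significant components, and (b) an IS-style reweighting in which particle states stay fixed and only the weights are rescaled by the ratio of the fused GM intensity to the local GM intensity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical local SMC-PHD posterior: N weighted particles in 1-D.
# The per-measurement weight decomposition w[i, j] (particle i, column j;
# column 0 standing in for the misdetection term) is assumed to be
# available from the SMC-PHD update, as the abstract describes.
N, M = 500, 3                                          # particles, measurements
x = rng.normal(0.0, 4.0, size=N)                       # particle states
w = rng.dirichlet(np.ones(M + 1), size=N) / N          # decomposed weights

def p2gm(x, w, threshold=0.05):
    """P2GM-style conversion: one Gaussian component per weight column,
    keeping only components whose total mass exceeds a threshold
    (the 'partial' dissemination of significant components)."""
    comps = []
    for j in range(w.shape[1]):
        wj = w[:, j]
        mass = wj.sum()
        if mass < threshold:                           # drop insignificant components
            continue
        mean = np.sum(wj * x) / mass
        var = np.sum(wj * (x - mean) ** 2) / mass
        comps.append((mass, mean, max(var, 1e-6)))
    return comps

def gm_intensity(comps, x):
    """Evaluate the (unnormalized) GM intensity at states x."""
    total = np.zeros_like(x)
    for mass, mean, var in comps:
        total += mass * np.exp(-0.5 * (x - mean) ** 2 / var) \
                 / np.sqrt(2.0 * np.pi * var)
    return total

local_gm = p2gm(x, w)
# Stand-in for the GM returned by the consensus iterations with the
# neighbors; here we simply perturb the local GM for illustration.
fused_gm = [(m, mu + 0.1, v) for (m, mu, v) in local_gm]

# IS-style reweighting: states are untouched, weights are scaled by the
# fused-to-local intensity ratio evaluated at each particle state.
w_total = w.sum(axis=1)
ratio = gm_intensity(fused_gm, x) / np.maximum(gm_intensity(local_gm, x), 1e-12)
w_new = w_total * ratio
```

Because the particle states never move, the local prediction and likelihood evaluation for the next scan can proceed while the GM dissemination and fusion run, which is the parallelization the abstract highlights.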


