Federated Bayesian Computation via Piecewise Deterministic Markov Processes
When performing Bayesian computation in practice, one is often faced with the challenge that the constituent model components and/or the data are only available in a distributed fashion, e.g. due to privacy concerns or sheer volume. While various methods have been proposed for posterior inference in such federated settings, they either make very strong assumptions on the data and/or model, or introduce significant bias when the local posteriors are combined to approximate the target posterior. By leveraging recently developed Markov Chain Monte Carlo (MCMC) methods based on Piecewise Deterministic Markov Processes (PDMPs), we develop a computation- and communication-efficient family of posterior inference algorithms (Fed-PDMC) which provides asymptotically exact approximations of the full posterior over a large class of Bayesian models, allowing heterogeneous model and data contributions from each client. We show that communication between clients and the server preserves the privacy of the individual data sources by establishing differential privacy guarantees. Finally, we quantify the performance of Fed-PDMC on a class of illustrative analytical case studies and demonstrate its efficacy on a number of synthetic examples as well as realistic Bayesian computation benchmarks.
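To make the PDMP machinery concrete, the following is a minimal sketch of a standard Zig-Zag sampler, the canonical member of the PDMP-MCMC family the abstract builds on, targeting a one-dimensional standard Gaussian. This is purely illustrative and is not the paper's Fed-PDMC algorithm; the function name and the choice of target are our own assumptions. The process moves ballistically with velocity v in {-1, +1} and flips v at events of a Poisson process with rate max(0, v U'(x)), where U(x) = x^2/2 is the negative log-density; for this target the event times can be sampled in closed form.

```python
import numpy as np

def zigzag_gaussian(n_events, seed=0):
    """Illustrative Zig-Zag PDMP sampler for pi(x) proportional to exp(-x^2/2).

    Between events the state moves deterministically, x(t) = x + v t.
    The velocity flips at events of an inhomogeneous Poisson process with
    rate lambda(x, v) = max(0, v * U'(x)) = max(0, v * x).
    Returns time-averaged first and second moments, computed exactly
    along the piecewise-linear trajectory.
    """
    rng = np.random.default_rng(seed)
    x, v = 0.0, 1.0
    total_t, m1, m2 = 0.0, 0.0, 0.0  # elapsed time, integral of x dt, of x^2 dt
    for _ in range(n_events):
        a = v * x  # the rate s units ahead is max(0, a + s)
        e = rng.exponential()
        # Solve the integrated-rate equation int_0^tau max(0, a + s) ds = e
        # in closed form (valid for both signs of a).
        tau = -a + np.sqrt(max(a, 0.0) ** 2 + 2.0 * e)
        x_new = x + v * tau
        # Exact segment integrals of x and x^2 along the linear path.
        m1 += tau * (x + x_new) / 2.0
        m2 += tau * (x * x + x * x_new + x_new * x_new) / 3.0
        total_t += tau
        x = x_new
        v = -v  # event: flip the velocity
    return m1 / total_t, m2 / total_t
```

A long run recovers the target's moments (mean 0, variance 1) as time averages, illustrating the asymptotic exactness of PDMP samplers; federated variants such as Fed-PDMC must additionally arrange how clients contribute to the event rates without sharing raw data.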