Proving Expected Sensitivity of Probabilistic Programs
Program sensitivity, also known as Lipschitz continuity, measures how small changes in a program's input lead to bounded changes in its output. We propose an average notion of sensitivity for probabilistic programs, called expected sensitivity, which averages a distance function over a probabilistic coupling of the two output distributions produced from two similar inputs. By varying the distance, expected sensitivity recovers useful notions of probabilistic function sensitivity, including stability of machine learning algorithms and convergence of Markov chains. Moreover, expected sensitivity satisfies clean compositional properties and is amenable to formal verification. We develop a relational program logic called EpRHL for proving expected sensitivity properties. Our logic features two key ideas. First, relational preconditions and postconditions are expressed using distances, a real-valued generalization of the usual boolean-valued relational assertions. Second, judgments are interpreted in terms of expectation couplings, a novel, quantitative generalization of probabilistic couplings that supports compositional reasoning. We demonstrate our logic on examples beyond the reach of prior relational logics. Our main example formalizes uniform stability of the stochastic gradient method. We also prove rapid mixing for a probabilistic model of population dynamics. Finally, we extend our logic with a transitivity principle for expectation couplings that captures the path coupling proof technique of Bubley and Dyer, and we formalize rapid mixing of the Glauber dynamics from statistical physics.
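To make the central notion concrete, the following is a rough sketch of the definition as described in the abstract; the notation (d_in, d_out, f, the coupling set Gamma, and the semantics brackets) is our own shorthand for illustration, not necessarily the paper's.

\documentclass{article}
\usepackage{amsmath, amssymb, stmaryrd} % stmaryrd provides \llbracket, \rrbracket
\begin{document}
% Sketch of expected sensitivity (assumed notation): a probabilistic program c
% is f-expected sensitive with respect to an input distance d_in and an output
% distance d_out if, for any two inputs v_1 and v_2, there exists a coupling mu
% of the two output distributions whose expected output distance is bounded by
% f applied to the input distance.
\[
  \exists\, \mu \in \Gamma\bigl(\llbracket c \rrbracket(v_1),\, \llbracket c \rrbracket(v_2)\bigr):
  \quad
  \mathbb{E}_{(x_1, x_2) \sim \mu}\bigl[\, d_{\mathrm{out}}(x_1, x_2) \,\bigr]
  \;\le\; f\bigl(d_{\mathrm{in}}(v_1, v_2)\bigr)
\]
% Here \Gamma(\mu_1, \mu_2) is the set of probabilistic couplings of \mu_1 and
% \mu_2, i.e. joint distributions over pairs whose marginals are \mu_1 and \mu_2.
% Taking f(r) = L r gives a Lipschitz-style bound in expectation; other choices
% of d_out and f correspond to stability or mixing-time statements.
\end{document}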