Ternary Representation of Stochastic Change and the Origin of Entropy and Its Fluctuations

02/25/2019
by Hong Qian, et al.

A change in a stochastic system has three representations: probabilistic, statistical, and informational: (i) is based on the change of a random variable u(ω) → ũ(ω); this induces (ii) a change in the probability distributions F_u(x) → F_ũ(x), x ∈ R^n; and (iii) a change in the probability measure P → P̃ under the same observable u(ω). In the informational representation, a change is quantified by the logarithm of the Radon-Nikodym derivative, ln[(dP̃/dP)(ω)] = -ln[(dF_u/dF_ũ)(x)] when x = u(ω). Substituting a random variable into its own density function creates a fluctuating entropy whose expectation is the entropy given by Shannon. The informational representation of a deterministic transformation on R^n reveals entropic and energetic terms, as well as the notions of configurational entropy of Boltzmann and Gibbs and the potential of mean force of Kirkwood. Mutual information arises for correlated u(ω) and ũ(ω), and a nonequilibrium thermodynamic entropy balance equation is identified.
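As a concrete illustration (not taken from the paper, and using Gaussian distributions chosen purely as assumptions), the following Python sketch draws samples of an observable u(ω) ~ N(0,1) under P, forms the fluctuating entropy -ln f_u(u(ω)) and checks that its sample mean matches Shannon's differential entropy, then changes the measure so that the same observable is distributed as N(1,1). Restricted to the σ-algebra generated by u, the log Radon-Nikodym derivative reduces to the log density ratio at x = u(ω), and its expectation under P of the negative recovers the relative entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Probabilistic representation: the observable u(omega) ~ N(0, 1) under P.
mu_P, sigma_P = 0.0, 1.0
u = rng.normal(mu_P, sigma_P, size=1_000_000)

# Fluctuating entropy: substitute u(omega) into its own density f_u.
# Its expectation is Shannon's (differential) entropy of N(0, 1).
eta = -np.log(gauss_pdf(u, mu_P, sigma_P))
print("E[-ln f_u(u)] (Monte Carlo):", eta.mean())
print("Shannon differential entropy:", 0.5 * np.log(2.0 * np.pi * np.e * sigma_P**2))

# Informational representation: change the measure P -> P~ so that the same
# observable u(omega) is distributed as N(1, 1) (an assumed example).
# On the sigma-algebra generated by u, the log Radon-Nikodym derivative is the
# log density ratio evaluated at x = u(omega):
#   ln (dP~/dP)(omega) = ln f_u~(x) - ln f_u(x) = -ln (dF_u/dF_u~)(x).
mu_Q, sigma_Q = 1.0, 1.0
log_rn = np.log(gauss_pdf(u, mu_Q, sigma_Q)) - np.log(gauss_pdf(u, mu_P, sigma_P))

# Averaging -ln(dP~/dP) under P gives the relative entropy D(F_u || F_u~) >= 0;
# for two Gaussians with equal variance the closed form is (mu_P - mu_Q)^2 / (2 sigma^2).
print("D(F_u || F_u~) (Monte Carlo):", (-log_rn).mean())
print("D(F_u || F_u~) (closed form):", (mu_P - mu_Q) ** 2 / (2.0 * sigma_Q ** 2))
```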
