Maximum entropy from the laws of probability

2001 
A new derivation is presented of maximum entropy, which is an extremizing principle for assigning probability distributions from expectation values. The additive form ΣiΦ(pi) for the maximand is first proved by requiring that, when some probabilities are given, the procedure for finding the remaining probabilities should not depend on the values of the given probabilities. This condition induces functional equations whose solution generates the additive form. To find the function Φ we assign two distributions in separate spaces from separate expectation values; then assign a joint distribution by taking these same values to be expectations of its marginals; then require these marginals to be the same as the separately assigned distributions. The resulting functional equations have only one viable solution—the entropic form Φ(z)=−z ln z. The exploitation of marginal distributions is due to Shore and Johnson [1], but the present derivation uses weaker axioms that require only consistency with the sum and pr...
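The assignment procedure the abstract describes, maximizing Σ_i Φ(p_i) with Φ(z) = −z ln z subject to a known expectation value, yields the familiar exponential (Gibbs) family p_i ∝ exp(λ x_i), with the multiplier λ fixed by the constraint. A minimal sketch of that procedure, assuming a hypothetical discrete variable with outcomes x = (1, 2, 3) and prescribed mean E[x] = 2.5, and solving for λ by bisection:

```python
import math

def maxent(xs, mu, tol=1e-12):
    """Maximum-entropy distribution over outcomes xs with E[x] = mu.

    Extremizing sum(-p*ln p) under the normalization and expectation
    constraints gives p_i proportional to exp(lam * x_i); the value of
    lam is pinned down by the expectation constraint. Since E[x] is
    monotone increasing in lam, a bisection suffices to solve for it.
    """
    def expectation(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)  # partition function (normalization constant)
        return sum(x * wi for x, wi in zip(xs, w)) / z

    lo, hi = -50.0, 50.0  # bracket for the Lagrange multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if expectation(mid) < mu:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent([1, 2, 3], 2.5)
mean = sum(x * pi for x, pi in zip([1, 2, 3], p))
print([round(pi, 4) for pi in p])   # maximum-entropy probabilities
print(round(mean, 6))               # reproduces the constraint, 2.5
```

The names `maxent` and `expectation` are illustrative, not from the paper; the point is only that the entropic form Φ(z) = −z ln z, which the paper derives axiomatically, is what makes this exponential-family solution fall out of the extremization.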