A differentiable measure of pointwise shared information.

2020 
Partial information decomposition (PID) of the multivariate mutual information describes the distinct ways in which a set of source variables contains information about a target variable. The groundbreaking work of Williams and Beer showed that this decomposition cannot be determined from classic information theory without additional assumptions, and several candidate measures have been proposed, often drawing on principles from related fields such as decision theory. None of these measures is differentiable with respect to the underlying probability mass function. We here present a novel measure that draws only on the principle linking the local mutual information to the exclusion of probability mass, a principle foundational to Fano's original definition of the mutual information. We reuse this principle to define a measure of shared information that is differentiable and well defined for individual realizations of the random variables. We show that the measure can be interpreted as a local mutual information with the help of an auxiliary variable. We also show that it has a meaningful Möbius inversion on a redundancy lattice and obeys a target chain rule. We give an operational interpretation of the measure based on the decisions an agent should take if given only the shared information.
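The local (pointwise) mutual information that the abstract refers to is, for a single realization (s, t), i(s;t) = log p(s,t) / (p(s) p(t)), which is a smooth function of the probability mass function wherever the probabilities are positive. The following is a minimal sketch of that quantity only; it is not the paper's shared-information measure (whose exclusion-based construction for multiple sources is not reproduced here), and the function name and toy distribution are illustrative assumptions.

    import numpy as np

    def local_mutual_information(p_joint, s, t, base=2.0):
        """Pointwise mutual information i(s;t) = log p(s,t) / (p(s) p(t)).

        p_joint: 2-D array with p_joint[s, t] = P(S=s, T=t); rows index the
        source, columns the target. The result is a smooth function of the
        pmf entries wherever they are strictly positive.
        """
        p_s = p_joint.sum(axis=1)[s]   # marginal P(S=s)
        p_t = p_joint.sum(axis=0)[t]   # marginal P(T=t)
        p_st = p_joint[s, t]           # joint P(S=s, T=t)
        return np.log(p_st / (p_s * p_t)) / np.log(base)

    # Example: a noisy copy of a uniform binary target.
    p = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
    print(local_mutual_information(p, s=0, t=0))  # positive: s=0 makes t=0 more likely
    print(local_mutual_information(p, s=0, t=1))  # negative: misinformative realization

Because the expression depends only on sums and logarithms of pmf entries, its gradient with respect to those entries exists on the interior of the probability simplex; this is the sense of differentiability the abstract emphasizes for the proposed shared-information measure.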