
Interaction information

The interaction information (McGill, 1954), also called the amount of information (Hu Kuo Ting, 1962) or co-information (Bell, 2003), is one of several generalizations of the mutual information. Interaction information expresses the amount of information (redundancy or synergy) bound up in a set of variables, beyond that which is present in any subset of those variables. Unlike the mutual information, the interaction information can be either positive or negative. This confusing property has likely retarded its wider adoption as an information measure in machine learning and cognitive science. These functions, their negativity and minima have a direct interpretation in algebraic topology (Baudot & Bennequin, 2015).

For three variables {X, Y, Z}, the interaction information I(X;Y;Z) is given by

    I(X;Y;Z) = I(X;Y|Z) − I(X;Y)

where, for example, I(X;Y) is the mutual information between variables X and Y, and I(X;Y|Z) is the conditional mutual information between X and Y given Z. Formally, in terms of marginal and joint entropies,

    I(X;Y;Z) = H(X,Y) + H(X,Z) + H(Y,Z) − H(X) − H(Y) − H(Z) − H(X,Y,Z)

Under this sign convention, a positive value indicates synergy (conditioning on the third variable increases the dependence between the other two) and a negative value indicates redundancy.
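The entropy expansion above can be computed directly from a joint distribution. The following is a minimal sketch (the function names `entropy`, `marginal`, and `interaction_information` are illustrative, not from any particular library), using the XOR gate as the classic synergy example: with X and Y independent uniform bits and Z = X xor Y, every pairwise mutual information is zero, yet the three variables together are fully determined, giving I(X;Y;Z) = +1 bit.

```python
import math
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, axes):
    """Marginalize a joint dict {(x, y, z): p} onto the given axis indices."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def interaction_information(joint):
    """I(X;Y;Z) = H(X,Y) + H(X,Z) + H(Y,Z) - H(X) - H(Y) - H(Z) - H(X,Y,Z),
    i.e. the convention I(X;Y;Z) = I(X;Y|Z) - I(X;Y)."""
    H = lambda axes: entropy(marginal(joint, axes))
    return (H((0, 1)) + H((0, 2)) + H((1, 2))
            - H((0,)) - H((1,)) - H((2,))
            - entropy(joint))

# XOR example: X, Y uniform independent bits, Z = X xor Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
print(interaction_information(joint))  # +1 bit: pure synergy
```

A redundancy example works the same way: setting X = Y = Z (a single shared bit copied three times) makes every pairwise mutual information 1 bit while I(X;Y|Z) = 0, so the interaction information comes out at −1 bit.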

[ "Statistics", "Human–computer interaction", "Machine learning", "Data mining", "Artificial intelligence", "Variation of information" ]
Parent Topic
Child Topic
    No Parent Topic