Information Decomposition and Synergy
2015
Recently, a series of papers has addressed the problem of decomposing the information of two random variables into shared information, unique information, and synergistic information. Several measures have been proposed, but no consensus has yet been reached. Here, we compare these proposals with an older approach that defines synergistic information via projections onto exponential families containing only up to k-th order interactions. We show that these measures are not compatible with a decomposition into unique, shared, and synergistic information if one requires that all terms are always non-negative (local positivity). We illustrate the difference between the two measures for multivariate Gaussians.
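A minimal sketch of the quantities at issue, using the standard XOR example (not taken from this paper's experiments): with X and Y independent uniform bits and Z = X XOR Y, neither input alone carries information about Z, yet jointly they determine it, so the excess I(X,Y;Z) − I(X;Z) − I(Y;Z) is purely synergistic. All names and helpers below are illustrative.

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy in bits of a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    """Marginalize a joint dict {(x, y, z): p} onto the coordinate indices in idx."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def mi(joint, a, b):
    """I(A;B) = H(A) + H(B) - H(A,B), where a and b are tuples of coordinate indices."""
    return (entropy(marginal(joint, a)) + entropy(marginal(joint, b))
            - entropy(marginal(joint, a + b)))

# XOR gate: X, Y independent uniform bits, Z = X xor Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

i_xz  = mi(joint, (0,), (2,))     # I(X;Z) = 0 bits
i_yz  = mi(joint, (1,), (2,))     # I(Y;Z) = 0 bits
i_xyz = mi(joint, (0, 1), (2,))   # I(X,Y;Z) = 1 bit

# The joint information exceeds the sum of the individual informations:
# this excess is what a decomposition must attribute to synergy.
synergy = i_xyz - i_xz - i_yz     # = 1 bit, purely synergistic
print(i_xz, i_yz, i_xyz, synergy)
```

This co-information-style difference is only a signed net measure; the decompositions compared in the paper aim to split I(X,Y;Z) into separately non-negative shared, unique, and synergistic parts, which is exactly where the local-positivity conflict arises.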
Keywords:
- Multivariate statistics
- Multivariate mutual information
- Pointwise mutual information
- Entropy (information theory)
- Exponential family
- Mutual information
- Mathematics
- Machine learning
- Variation of information
- Interaction information
- Artificial intelligence
- Random variable
- Theoretical computer science
- Computer science