Information-theoretic approximations of the nonnegative rank

2017 
Common information was introduced by Wyner (IEEE Trans Inf Theory 21(2):163–179, 1975) as a measure of the dependence of two random variables. This measure has recently been resurrected as a lower bound on the logarithm of the nonnegative rank of a nonnegative matrix by Braun and Pokutta (Proceedings of FOCS, 2013) and Jain et al. (Proceedings of SODA, 2013). Lower bounds on nonnegative rank have important applications to several areas, such as communication complexity and combinatorial optimization. We begin a systematic study of common information, extending the dual characterization of Witsenhausen (SIAM J Appl Math 31(2):313–333, 1976). Our main results are: (i) Common information is additive under tensoring of matrices. (ii) It characterizes the (logarithm of the) amortized nonnegative rank of a matrix, i.e., the minimal nonnegative rank under tensoring and small $\ell_1$ perturbations; we also provide quantitative bounds generalizing previous asymptotic results by Wyner (IEEE Trans Inf Theory 21(2):163–179, 1975). (iii) We deliver explicit witnesses from the dual problem for several matrices, leading to explicit lower bounds on common information that are robust under $\ell_1$ perturbations. This includes improved lower bounds for perturbations of the all-important unique disjointness partial matrix, as well as new insights into its information-theoretic structure.
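For context, the standard definitions behind these statements (as in Wyner's paper and the cited lower-bound works; this sketch is not taken verbatim from the abstract) can be summarized as follows. Normalizing a nonnegative matrix $M$ into a joint distribution of random variables $X$ and $Y$, Wyner's common information is

$$C(X;Y) = \min_{W \,:\, X \perp Y \mid W} I(XY;W),$$

where the minimum ranges over auxiliary random variables $W$ that make $X$ and $Y$ conditionally independent. Any nonnegative factorization of $M$ of size $r$ yields such a $W$ taking at most $r$ values, hence

$$C(X;Y) \le \log \operatorname{rank}_+(M),$$

which is the lower-bound connection between common information and nonnegative rank referred to above.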