Information-theoretic approximations of the nonnegative rank

Publisher:
Springer
Publication Type:
Journal Article
Citation:
Computational Complexity, 2017, 26(1), pp. 147–197
Issue Date:
2017-03-01
Filename:
s00037-016-0125-z.pdf (Adobe PDF, 765.75 kB)
Abstract:
Common information was introduced by Wyner (IEEE Trans Inf Theory 21(2):163–179, 1975) as a measure of the dependence of two random variables. This measure was recently resurrected as a lower bound on the logarithm of the nonnegative rank of a nonnegative matrix in Braun and Pokutta (Proceedings of FOCS, 2013) and Jain et al. (Proceedings of SODA, 2013). Lower bounds on nonnegative rank have important applications to several areas, such as communication complexity and combinatorial optimization. We begin a systematic study of common information, extending the dual characterization of Witsenhausen (SIAM J Appl Math 31(2):313–333, 1976). Our main results are: (i) Common information is additive under tensoring of matrices. (ii) It characterizes the (logarithm of the) amortized nonnegative rank of a matrix, i.e., the minimal nonnegative rank under tensoring and small ℓ1 perturbations. We also provide quantitative bounds generalizing previous asymptotic results by Wyner (IEEE Trans Inf Theory 21(2):163–179, 1975). (iii) We deliver explicit witnesses from the dual problem for several matrices, leading to explicit lower bounds on common information that are robust under ℓ1 perturbations. This includes improved lower bounds for perturbations of the all-important unique disjointness partial matrix, as well as new insights into its information-theoretic structure.
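The quantities referenced in the abstract can be summarized as follows; this is a hedged LaTeX sketch using standard conventions from the cited works, and the normalization of the matrix to a probability distribution is an assumption made here for illustration rather than a statement taken from this record.

% Wyner's common information of a pair (X,Y) distributed according to the
% nonnegative matrix M, normalized so that its entries sum to 1:
\[
  C(X;Y) \;=\; \min_{W \,:\, X \perp Y \mid W} I(X,Y;\,W),
\]
% where the minimum ranges over auxiliary random variables W that render X and Y
% conditionally independent. The lower bound discussed above then reads
\[
  \log_2 \operatorname{rank}_+(M) \;\ge\; C(X;Y),
\]
% and result (i) of the abstract asserts additivity under tensoring:
\[
  C(M \otimes N) \;=\; C(M) + C(N).
\]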