TY - GEN
T1 - Trading information complexity for error
AU - Dagan, Yuval
AU - Filmus, Yuval
AU - Hatami, Hamed
AU - Li, Yaqiao
N1 - Publisher Copyright:
© Yuval Dagan, Yuval Filmus, Hamed Hatami, and Yaqiao Li.
PY - 2017/7/1
Y1 - 2017/7/1
N2 - We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of ϵ. For arbitrary functions, we obtain lower bounds and upper bounds indicating a gain that is of order Ω(h(ϵ)) and O(h(√ϵ)). Here h denotes the binary entropy function. We analyze the case of the two-bit AND function in detail to show that for this function the gain is Θ(h(ϵ)). This answers a question of Braverman et al. [4]. We obtain sharp bounds for the set disjointness function of order n. For the case of the distributional error, we introduce a new protocol that achieves a gain of Θ(√h(ϵ)) provided that n is sufficiently large. We apply these results to answer another question of Braverman et al. regarding the randomized communication complexity of the set disjointness function. Answering a question of Braverman [3], we apply our analysis of the set disjointness function to establish a gap between the two different notions of the prior-free information cost. In light of [3], this implies that amortized randomized communication complexity is not necessarily equal to the amortized distributional communication complexity with respect to the hardest distribution. As a consequence, we show that the ϵ-error randomized communication complexity of the set disjointness function of order n is n[C_DISJ − Θ(h(ϵ))] + o(n), where C_DISJ ≈ 0.4827 is the constant found by Braverman et al. [4].
AB - We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of ϵ. For arbitrary functions, we obtain lower bounds and upper bounds indicating a gain that is of order Ω(h(ϵ)) and O(h(√ϵ)). Here h denotes the binary entropy function. We analyze the case of the two-bit AND function in detail to show that for this function the gain is Θ(h(ϵ)). This answers a question of Braverman et al. [4]. We obtain sharp bounds for the set disjointness function of order n. For the case of the distributional error, we introduce a new protocol that achieves a gain of Θ(√h(ϵ)) provided that n is sufficiently large. We apply these results to answer another question of Braverman et al. regarding the randomized communication complexity of the set disjointness function. Answering a question of Braverman [3], we apply our analysis of the set disjointness function to establish a gap between the two different notions of the prior-free information cost. In light of [3], this implies that amortized randomized communication complexity is not necessarily equal to the amortized distributional communication complexity with respect to the hardest distribution. As a consequence, we show that the ϵ-error randomized communication complexity of the set disjointness function of order n is n[C_DISJ − Θ(h(ϵ))] + o(n), where C_DISJ ≈ 0.4827 is the constant found by Braverman et al. [4].
KW - Communication complexity
KW - Information complexity
UR - http://www.scopus.com/inward/record.url?scp=85028769709&partnerID=8YFLogxK
U2 - 10.4230/LIPIcs.CCC.2017.16
DO - 10.4230/LIPIcs.CCC.2017.16
M3 - Conference contribution
AN - SCOPUS:85028769709
T3 - Leibniz International Proceedings in Informatics, LIPIcs
BT - 32nd Computational Complexity Conference, CCC 2017
A2 - O'Donnell, Ryan
T2 - 32nd Computational Complexity Conference, CCC 2017
Y2 - 6 July 2017 through 9 July 2017
ER -