Information Lower Bounds via Self-Reducibility

Author(s): Braverman, Mark; Garg, Ankit; Pankratov, Denis; Weinstein, Omri

Abstract: We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, which strengthens the Ω(n) bound recently shown by Kerenidis et al. (2012) and answers an open problem from Chakrabarti et al. (2012). In our second result we prove that the information cost of IP_n is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound recently proved by Braverman and Weinstein (ECCC 2011). Our proofs demonstrate that self-reducibility turns the connection between information complexity and communication complexity lower bounds into a two-way connection. Whereas numerous results in the past (Chakrabarti et al. 2001; Bar-Yossef et al. 2004; Barak et al. 2010) used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
Publication Date: 2016
Citation: Braverman, Mark, Ankit Garg, Denis Pankratov, and Omri Weinstein. "Information Lower Bounds via Self-Reducibility." Theory of Computing Systems 59, no. 2 (2016): 377-396. doi:10.1007/s00224-015-9655-z
DOI: 10.1007/s00224-015-9655-z
ISSN: 1432-4350
Pages: 377 - 396
Type of Material: Journal Article
Journal/Proceeding Title: Theory of Computing Systems
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.