Information Lower Bounds via Self-reducibility

Author(s): Braverman, Mark; Garg, Ankit; Pankratov, Denis; Weinstein, Omri

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1v826
Abstract: We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, which strengthens the Ω(n) bound recently shown by [15] and answers an open problem from [10]. In our second result we prove that the information cost of IP_n is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound recently proved by [9]. Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds a two-way one. Whereas numerous past results [13,2,3] used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
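
As an illustration of this black-box transfer, here is a minimal sketch in LaTeX, not drawn from the paper itself. It assumes the Braverman–Rao characterization of information complexity as amortized communication complexity, and it assumes a self-reduction in which one copy of f_{kn} can be computed from k copies of f_n with only o(kn) communication overhead; the family f = (f_n), the distribution μ, and the overhead term are assumptions of the sketch, not claims from the abstract.

% Amortized characterization (Braverman--Rao): information complexity
% is the limiting per-copy cost of solving k independent copies.
\[
  \mathrm{IC}_{\mu}(f_n,\varepsilon)
    \;=\; \lim_{k\to\infty}\frac{\mathrm{CC}\!\left(f_n^{\,k},\mu^{k},\varepsilon\right)}{k}.
\]
% Assumed self-reducibility: a single copy of f_{kn} is computable
% from the answers to k copies of f_n on its length-n blocks, up to
% o(kn) extra communication, hence
\[
  \mathrm{CC}\!\left(f_n^{\,k}\right) \;\ge\; \mathrm{CC}(f_{kn}) - o(kn).
\]
% Combining the two: a linear communication lower bound
% CC(f_m) = Omega(m) transfers to a linear information lower bound.
\[
  \mathrm{IC}_{\mu}(f_n,\varepsilon)
    \;\ge\; \limsup_{k\to\infty}\frac{\mathrm{CC}(f_{kn}) - o(kn)}{k}
    \;=\; \Omega(n).
\]

For IP the embedding step is concrete: IP_{kn}(x, y) is the XOR of the k block values IP_n(x_i, y_i), so a protocol solving the k copies also solves the single large instance, modulo the small cost of combining the answers (error accumulation across copies is suppressed in this sketch).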
Publication Date: 2013
Citation: Braverman, Mark, Ankit Garg, Denis Pankratov, and Omri Weinstein. "Information Lower Bounds via Self-reducibility." In International Computer Science Symposium in Russia (2013), pp. 183–194. doi:10.1007/978-3-642-38536-0_16
DOI: 10.1007/978-3-642-38536-0_16
Pages: 183–194
Type of Material: Conference Article
Series/Report no.: Lecture Notes in Computer Science
Journal/Proceeding Title: Computer Science – Theory and Applications
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.