Abstract: We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD_n is linear even under the uniform distribution, which strengthens a recently shown Ω(n) bound and answers an open problem from earlier work. In our second result we prove that the information cost of IP_n is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening a recently proved Ω(n) lower bound. Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds a two-way connection. Whereas numerous results in the past [13,2,3] used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
Citation: Braverman, Mark, Ankit Garg, Denis Pankratov, and Omri Weinstein. "Information Lower Bounds via Self-reducibility." In International Computer Science Symposium in Russia (2013): pp. 183–194. doi:10.1007/978-3-642-38536-0_16
Pages: 183–194
Type of Material: Conference Article
Series/Report no.: Lecture Notes in Computer Science
Journal/Proceeding Title: Computer Science – Theory and Applications
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.