A Discrepancy Lower Bound for Information Complexity
Author(s): Braverman, Mark; Weinstein, Omri
To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1025j
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Braverman, Mark | - |
dc.contributor.author | Weinstein, Omri | - |
dc.date.accessioned | 2021-10-08T19:44:43Z | - |
dc.date.available | 2021-10-08T19:44:43Z | - |
dc.date.issued | 2012 | en_US |
dc.identifier.citation | Braverman, Mark, and Omri Weinstein. "A Discrepancy Lower Bound for Information Complexity." Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (2012): 459-470. doi:10.1007/978-3-642-32512-0_39 | en_US |
dc.identifier.issn | 0302-9743 | - |
dc.identifier.uri | https://arxiv.org/pdf/1112.2000.pdf | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1025j | - |
dc.description.abstract | This paper provides the first general technique for proving information lower bounds on two-party unbounded-rounds communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function f with respect to a distribution μ is Disc_μ(f), then any two-party randomized protocol computing f must reveal at least Ω(log(1/Disc_μ(f))) bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on {0,1}^n × {0,1}^n must reveal Ω(n) bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is Ω(1/√n), which provides an alternative to the recent proof of Viola [Vio11] of the Ω(log n) lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight Ω(log n) lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest. In a very recent breakthrough work of Kerenidis et al. [KLL+12], this simulation procedure was a building block in a proof that almost all known lower bound techniques for communication complexity (and not just discrepancy) apply to information complexity. | en_US |
dc.format.extent | 459 - 470 | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartof | Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques. | en_US |
dc.relation.ispartofseries | Lecture Notes in Computer Science; | - |
dc.rights | Author's manuscript | en_US |
dc.title | A Discrepancy Lower Bound for Information Complexity | en_US |
dc.type | Conference Article | en_US |
dc.identifier.doi | 10.1007/978-3-642-32512-0_39 | - |
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US |
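As background for the abstract above: the discrepancy Disc_μ(f) is the maximum, over all combinatorial rectangles S × T, of the gap between the μ-mass of 1-inputs and 0-inputs of f inside the rectangle. The sketch below (not from the paper; the function name `discrepancy` and the brute-force approach are illustrative assumptions) computes this quantity exactly for the Greater-Than function under the uniform distribution, which is only feasible for very small n:

```python
def discrepancy(f, n):
    """Brute-force the discrepancy of f: [2^n] x [2^n] -> {0,1} under the
    uniform distribution: the maximum, over all combinatorial rectangles
    S x T, of |Pr[f=1 and (x,y) in S x T] - Pr[f=0 and (x,y) in S x T]|.
    Exponential in 2^n -- an illustration only, not the paper's method."""
    N = 1 << n                      # number of n-bit inputs per party
    best = 0.0
    for S in range(1, 1 << N):      # nonempty row subsets, as bitmasks
        rows = [x for x in range(N) if S >> x & 1]
        for T in range(1, 1 << N):  # nonempty column subsets
            cols = [y for y in range(N) if T >> y & 1]
            # +1 for each 1-entry, -1 for each 0-entry inside the rectangle
            signed = sum(1 if f(x, y) else -1 for x in rows for y in cols)
            best = max(best, abs(signed) / (N * N))
    return best

# Greater-Than on n-bit integers: GT(x, y) = 1 iff x > y.
gt = lambda x, y: x > y
print(discrepancy(gt, 1))  # 0.5 for the 2x2 case
```

For n = 1 the maximizing rectangle is the full 2×2 matrix, which holds one 1-entry and three 0-entries, giving |1 − 3|/4 = 0.5; the paper's Ω(1/√n) bound describes how slowly this quantity can decay as n grows.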
Files in This Item:
File | Size | Format
---|---|---
DiscrepancyLowerBoundInformationComplexity.pdf | 255.2 kB | Adobe PDF
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.