
Information complexity is computable

Author(s): Braverman, Mark; Schneider, J

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1838x
Abstract: The information complexity of a function f is the minimum amount of information Alice and Bob need to exchange to compute f. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function f to within any additive error epsilon > 0, thus resolving the open question of whether information complexity is computable. In the process, we give the first explicit upper bound on the rate at which the information complexity of f restricted to b-bit protocols converges to the (unrestricted) information complexity of f.
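
For context, a standard formulation of the quantities the abstract refers to (a sketch using the usual conventions, not quoted from the paper; here mu denotes an input distribution, Pi the protocol transcript, and I(.;.|.) conditional mutual information):

    IC_\mu(\pi) = I(X ; \Pi \mid Y) + I(Y ; \Pi \mid X), \qquad (X, Y) \sim \mu
    IC_\mu(f) = \inf_{\pi \text{ computing } f} IC_\mu(\pi)
    IC^{b}_\mu(f) = \inf_{\pi \text{ computing } f \text{ using at most } b \text{ bits}} IC_\mu(\pi)

Under this reading, the convergence result in the abstract bounds how quickly the b-bit quantity approaches the unrestricted one as b grows.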
Publication Date: 2016
Electronic Publication Date: 2016
Citation: Braverman, M., & Schneider, J. (2016). Information complexity is computable. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016), LIPIcs vol. 55. doi:10.4230/LIPIcs.ICALP.2016.87
DOI: 10.4230/LIPIcs.ICALP.2016.87
Type of Material: Conference Article
Journal/Proceeding Title: 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.