Abstract: We prove a new exponential concentration inequality for a plug-in estimator of the Shannon mutual information. Previous results on mutual information estimation bounded only the expected error. The advantage of the exponential inequality is that, combined with the union bound, it guarantees accurate estimates of the mutual information for many pairs of random variables simultaneously. As an application, we show how to use such a result to optimally estimate the density function and graph of a distribution that is Markov to a forest graph.
Citation: Liu, Han, Larry Wasserman, and John D. Lafferty. "Exponential concentration for mutual information estimation with application to forests." In Advances in Neural Information Processing Systems (2012): 2537-2545.
Pages: 2537-2545
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Version: Final published version. The article is made available in OAR by the publisher's permission or policy.
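The abstract's pipeline can be illustrated with a minimal sketch: estimate mutual information for every pair of variables with a plug-in estimator, then build a forest by greedily adding the highest-MI edges. This is an assumption-laden simplification, not the paper's method: a histogram plug-in stands in for the paper's kernel density estimator, and an ad-hoc MI threshold (`thresh`) stands in for the paper's principled pruning; the function names `plugin_mi` and `chow_liu_forest` are hypothetical.

```python
# Hedged sketch of plug-in MI estimation and forest selection.
# Assumptions: samples lie in [lo, hi]; histogram plug-in replaces the
# paper's kernel estimator; `thresh` is an ad-hoc stand-in for pruning.
import math
from collections import Counter

def plugin_mi(xs, ys, bins=4, lo=0.0, hi=1.0):
    """Histogram plug-in estimate of I(X;Y) in nats from paired samples."""
    n = len(xs)
    def bin_of(v):
        b = int((v - lo) / (hi - lo) * bins)
        return min(max(b, 0), bins - 1)  # clamp boundary values
    pxy = Counter((bin_of(x), bin_of(y)) for x, y in zip(xs, ys))
    px = Counter(i for i, _ in pxy.elements())  # marginal counts for X
    py = Counter(j for _, j in pxy.elements())  # marginal counts for Y
    mi = 0.0
    for (i, j), c in pxy.items():
        p = c / n
        # p * log(p / (p_x * p_y)) with counts: log(c * n / (c_x * c_y))
        mi += p * math.log(c * n / (px[i] * py[j]))
    return mi

def chow_liu_forest(columns, bins=4, thresh=0.1):
    """Greedy forest over variables: Kruskal on pairwise plug-in MI,
    keeping only edges whose estimated MI exceeds `thresh`."""
    d = len(columns)
    edges = sorted(((plugin_mi(columns[a], columns[b], bins), a, b)
                    for a in range(d) for b in range(a + 1, d)),
                   reverse=True)
    parent = list(range(d))  # union-find to avoid cycles
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    forest = []
    for w, a, b in edges:
        if w <= thresh:
            break  # remaining pairs look (nearly) independent
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            forest.append((a, b))
    return forest
```

The union-bound idea from the abstract is what licenses this use: an exponential concentration inequality makes each of the O(d^2) pairwise MI estimates accurate with overwhelming probability, so all of them are simultaneously accurate and the greedy edge ranking is trustworthy.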
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.