A new entropy power inequality for integer-valued random variables

Author(s): Haghighatshoar, S; Abbe, Emmanuel; Telatar, IE

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1t56c
Abstract: The entropy power inequality (EPI) yields lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for special families of distributions with the differential entropy replaced by the discrete entropy, but no universal inequality is known (beyond trivial ones). More recently, the sumset theory for the entropy function yields a sharp inequality H(X+X′)-H(X)≥1/2-o(1) when X and X′ are independent identically distributed (i.i.d.) with high entropy. This paper provides the inequality H(X+X′)-H(X)≥ g(H(X)), where X and X′ are arbitrary i.i.d. integer-valued random variables and where g is a universal strictly positive function on ℝ+ satisfying g(0)=0. Extensions to nonidentically distributed random variables and to conditional entropies are also obtained.
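The inequality concerns the entropy gap H(X+X′)−H(X) for i.i.d. integer-valued X and X′. As a minimal numerical sketch (not the paper's construction of g), one can pick an arbitrary example distribution, obtain the distribution of X+X′ as the self-convolution, and check that the gap is strictly positive:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# An example distribution on {0, 1, 2}, chosen arbitrarily for illustration.
p = np.array([0.5, 0.3, 0.2])

# For i.i.d. X, X', the distribution of X + X' is the convolution of p with itself.
p_sum = np.convolve(p, p)

h_x = entropy_bits(p)
h_sum = entropy_bits(p_sum)
gap = h_sum - h_x
print(f"H(X) = {h_x:.4f} bits, H(X+X') = {h_sum:.4f} bits, gap = {gap:.4f}")
assert gap > 0  # consistent with the paper's claim of a strictly positive gap
```

This only verifies positivity for one distribution; the paper's contribution is a universal lower bound g(H(X)) valid for every integer-valued distribution.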
Publication Date: 2014
Citation: Haghighatshoar, S., Abbe, E., & Telatar, I. E. (2014). A new entropy power inequality for integer-valued random variables. IEEE Transactions on Information Theory, 60, 3787-3796. doi:10.1109/TIT.2014.2317181
DOI: 10.1109/TIT.2014.2317181
Pages: 3787 - 3796
Type of Material: Journal Article
Journal/Proceeding Title: IEEE Transactions on Information Theory
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.