
On the Minimum Mean $p$th Error in Gaussian Noise Channels and Its Applications

Author(s): Dytso, Alex; Bustin, Ronit; Tuninetti, Daniela; Devroye, Natasha; Poor, H Vincent; Shamai Shitz, Shlomo

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr19z90c10
Full metadata record
dc.contributor.author: Dytso, Alex
dc.contributor.author: Bustin, Ronit
dc.contributor.author: Tuninetti, Daniela
dc.contributor.author: Devroye, Natasha
dc.contributor.author: Poor, H Vincent
dc.contributor.author: Shamai Shitz, Shlomo
dc.date.accessioned: 2024-02-03T03:41:52Z
dc.date.available: 2024-02-03T03:41:52Z
dc.date.issued: 2017-12-13
dc.identifier.citation: Dytso, Alex, Bustin, Ronit, Tuninetti, Daniela, Devroye, Natasha, Poor, H Vincent, Shamai Shitz, Shlomo. (2018). On the Minimum Mean $p$th Error in Gaussian Noise Channels and Its Applications. IEEE Transactions on Information Theory, 64 (3), 2012-2037. doi:10.1109/tit.2017.2782786
dc.identifier.issn: 0018-9448
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr19z90c10
dc.description.abstract: The problem of estimating an arbitrary random vector from its observation corrupted by additive white Gaussian noise, where the cost function is taken to be the minimum mean pth error (MMPE), is considered. The classical minimum mean square error (MMSE) is a special case of the MMPE. Several bounds, properties, and applications of the MMPE are derived and discussed. The optimal MMPE estimator is found for Gaussian and binary input distributions. Properties of the MMPE as a function of the input distribution, signal-to-noise ratio (SNR), and order p are derived. The “single-crossing-point property” (SCPP), which provides an upper bound on the MMSE, and which together with the mutual information-MMSE relationship is a powerful tool in deriving converse proofs in multiuser information theory, is extended to the MMPE. Moreover, a complementary bound to the SCPP is derived. As a first application of the MMPE, a bound on the conditional differential entropy in terms of the MMPE is provided, which then yields a generalization of the Ozarow-Wyner lower bound on the mutual information achieved by a discrete input on a Gaussian noise channel. As a second application, the MMPE is shown to improve on previous characterizations of the phase transition phenomenon that manifests, in the limit as the length of the capacity-achieving code goes to infinity, as a discontinuity of the MMSE as a function of SNR. As a final application, the MMPE is used to show new bounds on the second derivative of the mutual information, or equivalently on the first derivative of the MMSE.
dc.format.extent: 2012-2037
dc.language.iso: en_US
dc.relation.ispartof: IEEE Transactions on Information Theory
dc.rights: Author's manuscript
dc.title: On the Minimum Mean $p$th Error in Gaussian Noise Channels and Its Applications
dc.type: Journal Article
dc.identifier.doi: doi:10.1109/tit.2017.2782786
dc.identifier.eissn: 1557-9654
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article
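
For orientation, the abstract above contrasts the MMPE with the classical MMSE. The sketch below gives one plausible formalization for the additive Gaussian noise channel; the channel notation, the choice of norm, and the absence of any per-dimension normalization are assumptions here and may differ from the paper's definitions.

```latex
% Additive Gaussian noise channel: Y = sqrt(snr) X + Z, with Z ~ N(0, I_n) independent of X.
% Minimum mean p-th error (MMPE): the smallest achievable p-th moment of the estimation
% error over all (measurable) estimators f of X from Y.  Normalization here is an assumption.
\operatorname{mmpe}(\mathbf{X}, \mathsf{snr}, p)
  = \inf_{f}\, \mathbb{E}\!\left[ \bigl\| \mathbf{X} - f(\mathbf{Y}, \mathsf{snr}) \bigr\|^{p} \right]

% Special case p = 2: the classical MMSE, attained by the conditional-mean estimator.
\operatorname{mmse}(\mathbf{X}, \mathsf{snr})
  = \operatorname{mmpe}(\mathbf{X}, \mathsf{snr}, 2)
  = \mathbb{E}\!\left[ \bigl\| \mathbf{X} - \mathbb{E}[\mathbf{X} \mid \mathbf{Y}] \bigr\|^{2} \right]
```

Read this way, the SCPP and Ozarow-Wyner-type results mentioned in the abstract are statements about how $\operatorname{mmpe}(\mathbf{X}, \mathsf{snr}, p)$ behaves as a function of $\mathsf{snr}$ and of the order $p$.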

Files in This Item:
File: 1607.01461.pdf (864.05 kB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.