Relative entropy at the channel output of a capacity-achieving code

In this paper we establish a new inequality tying together the coding rate, the probability of error, and the relative entropy between the channel and the auxiliary output distribution. This inequality is then used to show the strong converse and to prove that the output distribution of a code must be close, in relative entropy, to the capacity-achieving output distribution (for the DMC and the AWGN channel). One of the key tools in our analysis is the concentration of measure (isoperimetry).
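As background on the quantities named in the abstract: the relative entropy (Kullback-Leibler divergence) between output distributions P and Q, and the capacity-achieving output distribution of a DMC with capacity C, are the standard quantities

\[
D(P \,\|\, Q) = \sum_{y} P(y)\,\log\frac{P(y)}{Q(y)}, \qquad
P^{*}_{Y^n}(y^n) = \prod_{i=1}^{n} P^{*}_{Y}(y_i),
\]

where $P^{*}_{Y}$ is the single-letter output distribution induced by a capacity-achieving input distribution, and $P_{Y^n}$ is the output distribution induced by the code with equiprobable messages. As an illustrative sketch only (the paper's exact statement, constants, and dependence on the error probability $\epsilon$ are in the full text), an inequality of the general shape

\[
\log M + D(P_{Y^n} \,\|\, P^{*}_{Y^n}) \le nC + a_{\epsilon}\sqrt{n}
\]

for an $(n, M, \epsilon)$ code simultaneously yields the strong converse ($\tfrac{1}{n}\log M \le C + o(1)$ for any fixed $\epsilon < 1$) and shows that $D(P_{Y^n} \,\|\, P^{*}_{Y^n}) = o(n)$ whenever $\log M = nC - o(n)$, i.e., along capacity-achieving code sequences.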

Bibliographic Details
Main Authors: Polyanskiy, Yury; Verdu, Sergio
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Format: Article (Conference Paper)
Language: en_US
Published: Institute of Electrical and Electronics Engineers (IEEE), 2013
Online Access: http://hdl.handle.net/1721.1/79672
https://orcid.org/0000-0002-2109-0979
Citation: Polyanskiy, Yury, and Sergio Verdu. "Relative Entropy at the Channel Output of a Capacity-Achieving Code." In Proceedings of the 2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 52-59. Institute of Electrical and Electronics Engineers, 2011.
DOI: http://dx.doi.org/10.1109/Allerton.2011.6120149
ISBN: 978-1-4577-1818-2; 978-1-4577-1817-5; 978-1-4577-1816-8
Funding: National Science Foundation (U.S.) (Grants CCF-06-35154 and CCF-07-28445)
Terms of Use: Creative Commons Attribution-Noncommercial-Share Alike 3.0 (http://creativecommons.org/licenses/by-nc-sa/3.0/)