Relative entropy at the channel output of a capacity-achieving code
In this paper we establish a new inequality tying together the coding rate, the probability of error, and the relative entropy between the channel output distribution and the auxiliary output distribution. This inequality is then used to show the strong converse, and to prove that the output distribution of a code must...
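For readers unfamiliar with the central quantity in the abstract: the relative entropy (Kullback–Leibler divergence) between distributions P and Q on a finite alphabet is D(P‖Q) = Σₓ P(x) log(P(x)/Q(x)). The sketch below is background illustration only, not code from the paper:

```python
import math

def relative_entropy(p, q):
    """Relative entropy D(P || Q) in bits between two probability
    mass functions given as equal-length sequences.

    Returns math.inf when P is not absolutely continuous with
    respect to Q (i.e., Q assigns zero mass where P does not).
    """
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # 0 * log(0 / q) = 0 by convention
        if qi == 0.0:
            return math.inf
        d += pi * math.log2(pi / qi)
    return d

# D(P || P) = 0; a point mass vs. a fair coin gives 1 bit.
print(relative_entropy([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(relative_entropy([1.0, 0.0], [0.5, 0.5]))  # 1.0
```

The paper's inequality relates this divergence, evaluated between the code-induced output distribution and an auxiliary output distribution, to the coding rate and error probability.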
| Main Authors: | Polyanskiy, Yury; Verdu, Sergio |
| Other Authors: | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
| Format: | Article |
| Language: | en_US |
| Published: | Institute of Electrical and Electronics Engineers (IEEE), 2013 |
| Online Access: | http://hdl.handle.net/1721.1/79672 https://orcid.org/0000-0002-2109-0979 |
Similar Items
- Joint source-channel coding with feedback
  by: Kostina, Victoria, et al.
  Published: (2017)
- Joint Source-Channel Coding With Feedback
  by: Kostina, Victoria, et al.
  Published: (2019)
- ℓp-norms of codewords from capacity- and dispersion-achieving Gaussian codes
  by: Polyanskiy, Yury
  Published: (2013)
- Scalar coherent fading channel: dispersion analysis
  by: Polyanskiy, Yury, et al.
  Published: (2013)
- Saddle Point in the Minimax Converse for Channel Coding
  by: Polyanskiy, Yury
  Published: (2013)