Dissipation of Information in Channels With Input Constraints
A basic tenet of information theory, the data processing inequality states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates of the amount of such contraction are known; Dobrushin's coefficient fo...
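In standard notation (not part of this record), the data processing inequality mentioned in the abstract can be written as follows, for a channel $P_{Y|X}$ acting on two input distributions:

```latex
% Data processing inequality: if P_X, Q_X are two input distributions
% and P_Y, Q_Y their images under a fixed channel P_{Y|X}, then
D(P_Y \,\|\, Q_Y) \le D(P_X \,\|\, Q_X)
```

The article's subject, strong data processing inequalities, concerns sharpening the right-hand side by a contraction coefficient strictly less than one.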
Main Authors: Polyanskiy, Yury; Wu, Yihong
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: en_US
Published: Institute of Electrical and Electronics Engineers (IEEE), 2017
Online Access: http://hdl.handle.net/1721.1/111026 https://orcid.org/0000-0002-2109-0979
Similar Items
- Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
  by: Calmon, Flavio du Pin, et al.
  Published: (2021)
- Strong Data-Processing Inequalities for Channels and Bayesian Networks
  by: Polyanskiy, Yury, et al.
  Published: (2021)
- Wasserstein Continuity of Entropy and Outer Bounds for Interference Channels
  by: Polyanskiy, Yury, et al.
  Published: (2019)
- Peak-to-Average Power Ratio of Good Codes for Gaussian Channel
  by: Polyanskiy, Yury, et al.
  Published: (2016)
- Application of the information-percolation method to reconstruction problems on graphs
  by: Polyanskiy, Yury, et al.
  Published: (2021)