Dissipation of Information in Channels With Input Constraints
One of the basic tenets in information theory, the data processing inequality states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates on the amount of such contraction are known, Dobrushin's coefficient for the total variation being perhaps the most well-known. This paper investigates channels with an average input cost constraint. It is found that, while the contraction coefficient typically equals one (no contraction), the information nevertheless dissipates. A certain nonlinear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed based on coupling arguments. Some basic applications in stochastic control, uniqueness of Gibbs measures, and fundamental limits of noisy circuits are discussed. As an application, it is shown that, in the chain of n power-constrained relays and Gaussian channels, the end-to-end mutual information and maximal squared correlation decay as O(log log n/log n), which is in stark contrast with the exponential decay in chains of discrete channels. Similarly, the behavior of noisy circuits (composed of gates with bounded fan-in) and broadcasting of information on trees (of bounded degree) does not experience threshold behavior in the signal-to-noise ratio (SNR). Namely, unlike the case of discrete channels, the probability of bit error stays bounded away from 1/2 regardless of the SNR.
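The abstract's reference point, Dobrushin's contraction coefficient for total variation, can be checked numerically on a simple unconstrained channel. A minimal sketch (not taken from the paper; the channel parameter and input distributions are chosen arbitrarily for illustration): for a binary symmetric channel BSC(δ), the coefficient is known to equal 1 − 2δ, and the total-variation ratio attains it for every pair of distinct inputs.

```python
import numpy as np

def tv(p, q):
    """Total variation distance between two discrete distributions."""
    return 0.5 * np.abs(p - q).sum()

# BSC(delta): transition matrix W, rows indexed by input symbol.
delta = 0.1
W = np.array([[1 - delta, delta],
              [delta, 1 - delta]])

# Two arbitrary input distributions on {0, 1} (illustrative choice).
P = np.array([0.9, 0.1])
Q = np.array([0.2, 0.8])

# Data processing: TV(PW, QW) <= eta_TV * TV(P, Q), with equality
# for the BSC, whose Dobrushin coefficient is eta_TV = 1 - 2*delta.
contraction = tv(P @ W, Q @ W) / tv(P, Q)
print(contraction)  # equals 1 - 2*delta for any pair P != Q
```

For channels with an average input cost constraint, the paper's point is that this ratio can approach one, so a single coefficient no longer captures the dissipation; hence the Dobrushin curve.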
Main Authors: | Polyanskiy, Yury; Wu, Yihong |
---|---|
Other Authors: | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
Format: | Article |
Language: | en_US |
Published: | Institute of Electrical and Electronics Engineers (IEEE), 2017 |
Online Access: | http://hdl.handle.net/1721.1/111026 https://orcid.org/0000-0002-2109-0979 |
author | Polyanskiy, Yury Wu, Yihong |
author2 | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
collection | MIT |
description | One of the basic tenets in information theory, the data processing inequality states that the output divergence does not exceed the input divergence for any channel. For channels without input constraints, various estimates on the amount of such contraction are known, Dobrushin's coefficient for the total variation being perhaps the most well-known. This paper investigates channels with an average input cost constraint. It is found that, while the contraction coefficient typically equals one (no contraction), the information nevertheless dissipates. A certain nonlinear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed based on coupling arguments. Some basic applications in stochastic control, uniqueness of Gibbs measures, and fundamental limits of noisy circuits are discussed. As an application, it is shown that, in the chain of n power-constrained relays and Gaussian channels, the end-to-end mutual information and maximal squared correlation decay as O(log log n/log n), which is in stark contrast with the exponential decay in chains of discrete channels. Similarly, the behavior of noisy circuits (composed of gates with bounded fan-in) and broadcasting of information on trees (of bounded degree) does not experience threshold behavior in the signal-to-noise ratio (SNR). Namely, unlike the case of discrete channels, the probability of bit error stays bounded away from 1/2 regardless of the SNR. |
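The contrast the description draws between exponential decay and O(log log n/log n) decay can be made concrete with the naive baseline: linear amplify-and-forward relaying, where each unit-power relay merely rescales its noisy observation. Under that scheme the end-to-end squared correlation decays exponentially in the number of hops; the paper's result is that optimally chosen (nonlinear) relay functions decay far more slowly. A hedged sketch with an illustrative noise variance and hop count (not code from the paper):

```python
import numpy as np

# Chain of n unit-power relays over AWGN channels with noise variance
# sigma2. With *linear* amplify-and-forward, X_{k+1} = (X_k + Z_k)/sqrt(1+sigma2)
# keeps unit power, and the correlation with X_0 shrinks by a factor
# (1+sigma2)^(-1/2) per hop, so the squared correlation is (1+sigma2)^(-n):
# exponential decay. (Illustrative baseline only; the paper shows nonlinear
# relays achieve the much slower O(log log n / log n) decay.)
rng = np.random.default_rng(0)
sigma2, n, samples = 1.0, 6, 200_000

x = rng.standard_normal(samples)   # X_0 ~ N(0, 1), unit power
y = x.copy()
for _ in range(n):                 # n relay hops
    z = np.sqrt(sigma2) * rng.standard_normal(samples)
    y = (y + z) / np.sqrt(1 + sigma2)

rho2_empirical = np.corrcoef(x, y)[0, 1] ** 2
rho2_theory = (1 + sigma2) ** (-n)   # exactly 1/64 for sigma2 = 1, n = 6
print(rho2_empirical, rho2_theory)
```

The Monte Carlo estimate matches the closed form up to sampling error, confirming the exponential baseline that the paper's optimal-relay bound is contrasted against.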
format | Article |
id | mit-1721.1/111026 |
institution | Massachusetts Institute of Technology |
language | en_US |
publishDate | 2017 |
publisher | Institute of Electrical and Electronics Engineers (IEEE) |
record_format | dspace |
date_accessioned | 2017-08-28T17:44:59Z |
date_issued | 2015-09 |
type | Article (http://purl.org/eprint/type/JournalArticle) |
issn | 0018-9448, 1557-9654 |
citation | Polyanskiy, Yury and Wu, Yihong. "Dissipation of Information in Channels With Input Constraints." IEEE Transactions on Information Theory 62, 1 (January 2016): 35–55. © 2016 Institute of Electrical and Electronics Engineers (IEEE) |
doi | http://dx.doi.org/10.1109/TIT.2015.2482978 |
journal | IEEE Transactions on Information Theory |
rights | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/) |
file_format | application/pdf |
source | arXiv |
title | Dissipation of Information in Channels With Input Constraints |
url | http://hdl.handle.net/1721.1/111026 https://orcid.org/0000-0002-2109-0979 |