Error Exponents and α-Mutual Information
Over the last six decades, the representation of error exponent functions for data transmission through noisy channels at rates below capacity has seen three distinct approaches: (1) through Gallager's E₀ functions (with and without cost constraints); (2) through the large-deviations form, in terms of conditional relative entropy and mutual information; and (3) through the α-mutual information and the Augustin–Csiszár mutual information of order α, both derived from the Rényi divergence. While a fairly complete picture has emerged in the absence of cost constraints, gaps have remained in the interrelationships between the three approaches in the general case of cost-constrained encoding. Furthermore, no systematic approach has been proposed to solve the attendant optimization problems by exploiting the specific structure of the information functions. This paper closes those gaps and proposes a simple method to maximize the Augustin–Csiszár mutual information of order α under cost constraints by means of the maximization of the α-mutual information subject to an exponential average constraint.
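As a brief sketch of the standard definitions behind the abstract's terminology (discrete alphabets assumed; the notation $P_X$ for the input distribution and $W = P_{Y|X}$ for the channel is ours, not necessarily the paper's):

```latex
% Rényi divergence of order \alpha, for \alpha \in (0,1) \cup (1,\infty):
D_\alpha(P \| Q) = \frac{1}{\alpha - 1}\,\log \sum_x P(x)^{\alpha}\, Q(x)^{1-\alpha}

% \alpha-mutual information (Sibson's form) for input P_X and channel W:
I_\alpha(P_X, W) = \frac{\alpha}{\alpha - 1}\,\log \sum_y
    \Bigl( \sum_x P_X(x)\, W(y|x)^{\alpha} \Bigr)^{1/\alpha}

% Augustin--Csiszár mutual information of order \alpha:
I_\alpha^{c}(P_X, W) = \min_{Q_Y} \sum_x P_X(x)\,
    D_\alpha\bigl( W(\cdot\,|x) \,\big\|\, Q_Y \bigr)

% Gallager's E_0 function, and its classical link to the
% \alpha-mutual information at order \alpha = 1/(1+\rho):
E_0(\rho, P_X) = -\log \sum_y
    \Bigl( \sum_x P_X(x)\, W(y|x)^{1/(1+\rho)} \Bigr)^{1+\rho}
              = \rho\, I_{1/(1+\rho)}(P_X, W)
```

With these forms, the paper's program as stated in the abstract reads naturally: the maximization of $I_\alpha^{c}$ under a cost constraint is traded for the analytically simpler maximization of $I_\alpha$ subject to an exponential average constraint.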
Main Author: | Sergio Verdú (Independent Researcher, Princeton, NJ 08540, USA) |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-02-01 |
Series: | Entropy |
ISSN: | 1099-4300 |
DOI: | 10.3390/e23020199 |
Subjects: | information measures; relative entropy; Rényi divergence; mutual information; α-mutual information; Augustin–Csiszár mutual information |
Online Access: | https://www.mdpi.com/1099-4300/23/2/199 |
Collection: | Directory of Open Access Journals (DOAJ) |