Error Exponents and <i>α</i>-Mutual Information
Over the last six decades, the representation of error exponent functions for data transmission through noisy channels at rates below capacity has seen three distinct approaches: (1) Through Gallager's …
Main Author: Sergio Verdú
Format: Article
Language: English
Published: MDPI AG, 2021-02-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/23/2/199
Similar Items
- Conditional Rényi Divergence Saddlepoint and the Maximization of <i>α</i>-Mutual Information
  by: Changxiao Cai, et al.
  Published: (2019-10-01)
- Conditional Rényi Entropy and the Relationships between Rényi Capacities
  by: Gautam Aishwarya, et al.
  Published: (2020-05-01)
- Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis
  by: Elif Tuna, et al.
  Published: (2022-12-01)
- A Two-Moment Inequality with Applications to Rényi Entropy and Mutual Information
  by: Galen Reeves
  Published: (2020-11-01)
- Mutual Information: A way to quantify correlations
  by: Marcelo Tisoc, et al.
  Published: (2022-08-01)