On Data-Processing and Majorization Inequalities for *f*-Divergences with Applications
This paper is focused on the derivation of data-processing and majorization inequalities for *f*-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application is related to a study of the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed for asserting their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all the analysis is relegated to the appendices, which form the major part of this manuscript.
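As background for readers of this record (this sketch is not quoted from the article itself and uses generic notation): *f* denotes a convex function on (0, ∞) with *f*(1) = 0, and P, Q are probability mass functions on a common finite alphabet.

```latex
% Standard definition of an f-divergence (generic notation, not quoted from the paper):
\[
  D_f(P \,\|\, Q) \;=\; \sum_{x \in \mathcal{X}} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right),
  \qquad f \text{ convex on } (0,\infty),\ f(1)=0 .
\]
% Data-processing inequality: passing P_X and Q_X through the same stochastic
% kernel (channel) W_{Y|X} to obtain P_Y and Q_Y can only reduce the divergence:
\[
  D_f(P_Y \,\|\, Q_Y) \;\le\; D_f(P_X \,\|\, Q_X).
\]
% Example: f(t) = t \log t recovers the Kullback--Leibler divergence D(P \| Q).
```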
Main Author: | Igal Sason |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-10-01 |
Series: | Entropy |
Subjects: | contraction coefficient, data-processing inequalities, *f*-divergences, hypothesis testing, list decoding, majorization theory, Rényi information measures, Tsallis entropy, Tunstall trees |
Online Access: | https://www.mdpi.com/1099-4300/21/10/1022 |
_version_ | 1811308050172084224 |
---|---|
author | Igal Sason |
collection | DOAJ |
description | This paper is focused on the derivation of data-processing and majorization inequalities for *f*-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application is related to a study of the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed for asserting their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all the analysis is relegated to the appendices, which form the major part of this manuscript. |
first_indexed | 2024-04-13T09:16:18Z |
format | Article |
id | doaj.art-13104f5116de45a7a9f0ecd667033a98 |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-04-13T09:16:18Z |
publishDate | 2019-10-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | Igal Sason (Department of Electrical Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel), "On Data-Processing and Majorization Inequalities for *f*-Divergences with Applications," Entropy, vol. 21, no. 10, article 1022, published 2019-10-01 by MDPI AG, ISSN 1099-4300, doi:10.3390/e21101022, https://www.mdpi.com/1099-4300/21/10/1022; record doaj.art-13104f5116de45a7a9f0ecd667033a98 (2022-12-22T02:52:44Z). |
title | On Data-Processing and Majorization Inequalities for *f*-Divergences with Applications |
topic | contraction coefficient, data-processing inequalities, *f*-divergences, hypothesis testing, list decoding, majorization theory, Rényi information measures, Tsallis entropy, Tunstall trees |
url | https://www.mdpi.com/1099-4300/21/10/1022 |