Generalized Mutual Information

Mutual information is one of the essential building blocks of information theory. It is, however, only finitely defined for distributions in a subclass of the general class of all distributions on a joint alphabet. The unboundedness of mutual information prevents its potential utility from being extended to the general class. This is in fact a void in the foundation of information theory that needs to be filled. This article proposes a family of generalized mutual information whose members are indexed by a positive integer n, with the nth member being the mutual information of nth order. The mutual information of the first order coincides with Shannon's, which may or may not be finite. It is, however, established (a) that each mutual information of an order greater than 1 is finitely defined for all distributions of two random elements on a joint countable alphabet, and (b) that every member of the family enjoys all the utilities of a finite Shannon mutual information.
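
For illustration only (not taken from the article): the abstract states that the first-order member of the proposed family coincides with Shannon's mutual information. The sketch below computes Shannon mutual information for a finite joint distribution; the function name and the example 2x2 distribution are assumptions made here for the example, and the higher-order members defined in the article are not reproduced.

```python
# Minimal sketch: Shannon mutual information on a finite joint alphabet.
# Per the abstract, this is the first-order member of the proposed family;
# higher-order members are defined in the article and not implemented here.
import numpy as np

def shannon_mutual_information(joint) -> float:
    """Return I(X;Y) in nats for a joint probability matrix."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()              # normalize to a valid joint pmf
    px = joint.sum(axis=1, keepdims=True)    # marginal of X (row sums)
    py = joint.sum(axis=0, keepdims=True)    # marginal of Y (column sums)
    mask = joint > 0                         # skip zero cells to avoid log(0)
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

# Example: a 2x2 joint distribution with dependent X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(shannon_mutual_information(p_xy))      # about 0.193 nats
```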

Bibliographic Details
Main Author: Zhiyi Zhang (Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223, USA)
Format: Article
Language: English
Published: MDPI AG, 2020-06-01
Series: Stats, Vol. 3, No. 2, pp. 158-165
DOI: 10.3390/stats3020013
ISSN: 2571-905X
Subjects: mutual information; Shannon's entropy; conditional distribution of total collision; generalized entropy; generalized mutual information
Online Access: https://www.mdpi.com/2571-905X/3/2/13