On Normalized Mutual Information: Measure Derivations and Properties
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use...
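The paper's specific bounds and normalizations are not reproduced in this record, but the general idea can be illustrated with a minimal sketch: compute the mutual information of two discrete random variables from their joint distribution and normalize it by an entropy-based upper bound. The normalization `I(X;Y) / max(H(X), H(Y))` used below is one common choice, not necessarily one of the measures the article derives.

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

def mutual_information(joint):
    """MI of two discrete variables, given their joint distribution as rows."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pij in enumerate(row):
            if pij > 0:
                mi += pij * math.log(pij / (px[i] * py[j]))
    return mi

def nmi(joint):
    """One common normalization: I(X;Y) / max(H(X), H(Y)), giving a value in [0, 1]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return mutual_information(joint) / max(entropy(px), entropy(py))

# Perfectly dependent variables reach the upper bound (NMI = 1);
# independent variables give NMI = 0.
print(nmi([[0.5, 0.0], [0.0, 0.5]]))
print(nmi([[0.25, 0.25], [0.25, 0.25]]))
```

Since MI satisfies `I(X;Y) <= min(H(X), H(Y))`, dividing by `max(H(X), H(Y))` (or by other entropy combinations) bounds the measure in `[0, 1]`, which is what makes such normalized variants comparable across variable pairs.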
| Main Author: | Tarald O. Kvålseth |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2017-11-01 |
| Series: | Entropy |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1099-4300/19/11/631 |
Similar Items

- An Axiomatic Characterization of Mutual Information
  by: James Fullwood
  Published: (2023-04-01)
- Error Exponents and *α*-Mutual Information
  by: Sergio Verdú
  Published: (2021-02-01)
- Cumulative Measure of Inaccuracy and Mutual Information in *k*-th Lower Record Values
  by: Maryam Eskandarzadeh, et al.
  Published: (2019-02-01)
- An integrated review of measures of mutuality: Pros, cons, and future directions
  by: Bridget Hamilton, et al.
  Published: (2023-02-01)
- Tsallis Mutual Information for Document Classification
  by: Màrius Vila, et al.
  Published: (2011-09-01)