Information Distances versus Entropy Metric

Bibliographic Details
Main Authors: Bo Hu, Lvqing Bi, Songsong Dai
Format: Article
Language: English
Published: MDPI AG 2017-06-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/19/6/260
Summary: Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric: the former are based on Kolmogorov complexity, the latter on Shannon entropy. However, for any computable probability distribution, the expected value of Kolmogorov complexity equals the Shannon entropy up to a constant. We study a similar relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
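The relation alluded to in the summary is usually stated as follows; this is a sketch in standard notation from the Kolmogorov-complexity literature (K for prefix Kolmogorov complexity, H for Shannon entropy, P for a computable probability distribution, E and e for the information distance and its normalized version), not necessarily the exact formulation used in the article:

% Expected Kolmogorov complexity vs. Shannon entropy, up to a constant
% c_P that depends only on the (computable) distribution P:
\[
  0 \;\le\; \sum_{x} P(x)\,K(x) \;-\; H(P) \;\le\; c_P,
  \qquad
  H(P) = -\sum_{x} P(x)\log P(x).
\]
% The information distance and its normalized version (normalized
% information distance) are commonly defined as
\[
  E(x,y) = \max\{K(x \mid y),\; K(y \mid x)\},
  \qquad
  e(x,y) = \frac{\max\{K(x \mid y),\; K(y \mid x)\}}{\max\{K(x),\; K(y)\}}.
\]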
ISSN: 1099-4300