Axiomatic Characterizations of Information Measures
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the s...
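For readers unfamiliar with the two central quantities of the survey, here is a brief reminder of their standard definitions for discrete distributions P = (p_1, …, p_n) and Q = (q_1, …, q_n); these formulas are not part of the record's abstract and are included only for orientation.

```latex
% Standard definitions (with the usual convention 0 \log 0 = 0):
\[
  H(P) = -\sum_{i=1}^{n} p_i \log p_i
  \qquad \text{(Shannon entropy)}
\]
\[
  D(P \,\|\, Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
  \qquad \text{(Kullback I-divergence, i.e.\ relative entropy)}
\]
```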
| Main Author: | Imre Csiszár |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2008-09-01 |
| Series: | Entropy |
| Online Access: | http://www.mdpi.com/1099-4300/10/3/261/ |
Similar Items
- Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation
  by: Marco Scutari
  Published: (2024-01-01)
- Majorization, Csiszár divergence and Zipf-Mandelbrot law
  by: Naveed Latif, et al.
  Published: (2017-08-01)
- Paradigms of Cognition
  by: Flemming Topsøe
  Published: (2017-03-01)
- On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds
  by: Frank Nielsen
  Published: (2020-06-01)
- Are Guessing, Source Coding and Tasks Partitioning Birds of A Feather?
  by: M. Ashok Kumar, et al.
  Published: (2022-11-01)