Axiomatic Characterizations of Information Measures

Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.


Bibliographic Details
Main Author: Imre Csiszár
Format: Article
Language: English
Published: MDPI AG, 2008-09-01
Series: Entropy
Subjects: Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance
Online Access: http://www.mdpi.com/1099-4300/10/3/261/
_version_ 1798039103818694656
author Imre Csiszár
author_facet Imre Csiszár
author_sort Imre Csiszár
collection DOAJ
description Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
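For orientation, the two central quantities named in the description have the following standard definitions (supplied here for reference, not quoted from the record), for finite probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n):

H(P) = -\sum_{i=1}^{n} p_i \log p_i  (Shannon entropy, with the convention 0 \log 0 = 0)
D(P \| Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}  (Kullback I-divergence, equal to +\infty unless q_i = 0 implies p_i = 0)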
first_indexed 2024-04-11T21:49:18Z
format Article
id doaj.art-38c4a8aba56c4fffa8dddae2b9366199
institution Directory Open Access Journal
issn 1099-4300
language English
last_indexed 2024-04-11T21:49:18Z
publishDate 2008-09-01
publisher MDPI AG
record_format Article
series Entropy
spelling doaj.art-38c4a8aba56c4fffa8dddae2b9366199 | 2022-12-22T04:01:18Z | eng | MDPI AG | Entropy | 1099-4300 | 2008-09-01 | Vol. 10, No. 3, pp. 261-273 | doi:10.3390/e10030261 | Axiomatic Characterizations of Information Measures | Imre Csiszár | Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory. | http://www.mdpi.com/1099-4300/10/3/261/ | Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance
spellingShingle Imre Csiszár
Axiomatic Characterizations of Information Measures
Entropy
Shannon entropy
Kullback I-divergence
Rényi information measures
f-divergence
f-entropy
functional equation
proper score
maximum entropy
transitive inference rule
Bregman distance
title Axiomatic Characterizations of Information Measures
title_full Axiomatic Characterizations of Information Measures
title_fullStr Axiomatic Characterizations of Information Measures
title_full_unstemmed Axiomatic Characterizations of Information Measures
title_short Axiomatic Characterizations of Information Measures
title_sort axiomatic characterizations of information measures
topic Shannon entropy
Kullback I-divergence
Rényi information measures
f-divergence
f-entropy
functional equation
proper score
maximum entropy
transitive inference rule
Bregman distance
url http://www.mdpi.com/1099-4300/10/3/261/
work_keys_str_mv AT imrecsiszar axiomaticcharacterizationsofinformationmeasures