The entropy of a mixture of probability distributions

If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The...
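The log(n) claim for equiprobable values, and the behaviour of the entropy under mixing, can be checked numerically. A minimal sketch, assuming natural logarithms and the standard Shannon entropy (the paper's own conventions and notation are not given in this record):

```python
import math

def entropy(p):
    """Shannon entropy (natural log) of a discrete distribution p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# For a message with n equally probable values, the entropy is log(n).
n = 4
uniform = [1.0 / n] * n
assert abs(entropy(uniform) - math.log(n)) < 1e-12

# A mixed distribution lam*p + (1-lam)*q: by concavity of the entropy,
# the entropy of the mixture is at least the mixture of the entropies.
p = [0.7, 0.2, 0.1, 0.0]
q = [0.1, 0.1, 0.3, 0.5]
lam = 0.4
mix = [lam * a + (1 - lam) * b for a, b in zip(p, q)]
assert entropy(mix) >= lam * entropy(p) + (1 - lam) * entropy(q)
```

The distributions `p`, `q` and the weight `lam` are illustrative values, not taken from the paper.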


Bibliographic Details
Main Author: Alexis De Vos
Format: Article
Language: English
Published: MDPI AG 2005-01-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/7/1/15/