The entropy of a mixture of probability distributions
If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The...
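The abstract's opening claim, and the concavity property that makes mixtures of distributions interesting for entropy, can be checked with a short sketch (the `entropy` helper and the example distributions are illustrative, not taken from the paper):

```python
import math

def entropy(p):
    """Shannon entropy (natural logarithm) of a discrete distribution."""
    return -sum(q * math.log(q) for q in p if q > 0)

# A uniform distribution over n equally probable values has entropy log(n).
n = 8
uniform = [1 / n] * n
assert abs(entropy(uniform) - math.log(n)) < 1e-12

# Entropy is concave: the entropy of a mixture of two distributions
# is at least the corresponding mixture of their entropies.
p = [0.9, 0.1]
q = [0.1, 0.9]
mix = [0.5 * a + 0.5 * b for a, b in zip(p, q)]
assert entropy(mix) >= 0.5 * (entropy(p) + entropy(q))
```

Here the mixture of the two skewed distributions is uniform, so its entropy reaches log(2), strictly above the average entropy of its components.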
| Main Author: | Alexis Vos |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2005-01-01 |
| Series: | Entropy |
| Online Access: | http://www.mdpi.com/1099-4300/7/1/15/ |
Similar Items
- Hilbert space methods in probability and statistical inference /
  by: Small, Christopher G., et al.
  Published: (1994)
- Probability distributions on linear spaces /
  by: Vakhania, N. N.
  Published: (1981)
- Reproducing kernel Hilbert spaces in probability and statistics /
  by: Berlinet, Alain, et al.
  Published: (2004)
- Modeling Insurance Claim Distribution via Mixture Distribution and Copula
  by: Saeed Bajalan, et al.
  Published: (2017-04-01)
- Determination of the Maxwell-Boltzmann Distribution Probability for Different Gas Mixtures
  by: Ibrahim Kaittan Fayyadh, et al.
  Published: (2014-07-01)