Exact Probability Distribution versus Entropy


Bibliographic Details
Main Author: Kerstin Andersson
Format: Article
Language: English
Published: MDPI AG 2014-10-01
Series: Entropy
Subjects: information entropy; security; guessing
Online Access: http://www.mdpi.com/1099-4300/16/10/5198
Collection: DOAJ
Description: The problem addressed concerns determining the average number of successive attempts required to guess a word of a given length consisting of letters with known probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is to guess words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements on both memory and central processing unit (CPU) time. For realistic sizes of alphabets and words (on the order of 100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. In those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average, compared to the total number, decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
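The exact quantity the abstract describes can be sketched by brute force — a minimal illustration under a first-order (i.i.d.) letter model, not the paper's own algorithm; the function name and example probabilities are invented for the demonstration. For the realistic sizes the abstract mentions (alphabets and word lengths near 100) this enumeration is infeasible, which is exactly why the paper develops approximations:

```python
from itertools import product

def average_guesses(letter_probs, word_len):
    """Exact expected number of guesses when words are guessed in
    decreasing order of probability (first-order i.i.d. letter model)."""
    # Probability of each word is the product of its letter probabilities.
    word_probs = []
    for letters in product(letter_probs, repeat=word_len):
        p = 1.0
        for lp in letters:
            p *= lp
        word_probs.append(p)
    # Optimal strategy: try the most probable words first.
    word_probs.sort(reverse=True)
    # Expected number of guesses = sum over ranks of rank * P(word at rank).
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))

# Tiny example: 2-letter alphabet, words of length 3.
print(average_guesses([0.7, 0.3], 3))
```

Sorting by decreasing probability is what makes this expectation minimal over all guessing orders; the enumeration grows as (alphabet size)^(word length), so even modest parameters overwhelm it and approximate or asymptotic estimates become necessary.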
ISSN: 1099-4300
Citation: Entropy 2014, 16(10), 5198-5210. DOI: 10.3390/e16105198
Author affiliation: Department of Mathematics and Computer Science, Karlstad University, SE-651 88 Karlstad, Sweden