Assessing Probabilistic Inference by Comparing the Generalized Mean of the Model and Source Probabilities

An approach to the assessment of probabilistic inference is described that quantifies performance on the probability scale. From both information theory and Bayesian theory, the central tendency of an inference is proven to be the geometric mean of the probabilities reported for the actual outcome, referred to as the “Accuracy”. Upper and lower error bars on the accuracy are provided by the arithmetic mean and the −2/3 mean. The arithmetic mean is called the “Decisiveness” due to its similarity with the cost of a decision, and the −2/3 mean is called the “Robustness” due to its sensitivity to outlier errors. Visualization of inference performance is facilitated by plotting the reported model probabilities versus the histogram-calculated source probabilities. The calibration between model and source is summarized on both axes by the arithmetic, geometric, and −2/3 means. From information theory, the performance of the inference is related to the cross-entropy between the model and source distributions. Just as cross-entropy is the sum of the entropy and the divergence, the accuracy of a model can be decomposed into a component due to the source uncertainty and the divergence between the source and model. Translated to the probability domain, these quantities are plotted as the average model probability versus the average source probability. The divergence probability is the average model probability divided by the average source probability. When an inference is over- or under-confident, the arithmetic mean of the model increases or decreases, while the −2/3 mean decreases or increases, respectively.
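
The three statistics named in the abstract are generalized (power) means of the probabilities the model reported for the events that actually occurred, P_r = ((1/N) * sum_i p_i^r)^(1/r), with the limit r -> 0 giving the geometric mean. The Python sketch below is illustrative only, not code from the article: the sample probabilities and the generalized_mean helper are hypothetical, and the divergence-probability line assumes "average" means the geometric mean, consistent with the multiplicative form of the cross-entropy decomposition.

import numpy as np

def generalized_mean(p, r):
    # Power mean of probabilities p with exponent r; r = 0 is treated as
    # the limiting case, the geometric mean (computed in log space).
    p = np.asarray(p, dtype=float)
    if r == 0:
        return float(np.exp(np.mean(np.log(p))))
    return float(np.mean(p ** r) ** (1.0 / r))

# Hypothetical probabilities a model reported for the outcomes that occurred,
# and hypothetical histogram-calculated source probabilities for the same events.
model  = [0.90, 0.80, 0.75, 0.60, 0.30]
source = [0.85, 0.80, 0.70, 0.65, 0.40]

accuracy     = generalized_mean(model, 0)      # geometric mean (central tendency)
decisiveness = generalized_mean(model, 1)      # arithmetic mean (upper error bar)
robustness   = generalized_mean(model, -2/3)   # -2/3 mean (lower error bar)

# Divergence probability, per the abstract: the average model probability
# divided by the average source probability (geometric means assumed here).
divergence_probability = accuracy / generalized_mean(source, 0)

print(f"Accuracy (r=0):         {accuracy:.3f}")
print(f"Decisiveness (r=1):     {decisiveness:.3f}")
print(f"Robustness (r=-2/3):    {robustness:.3f}")
print(f"Divergence probability: {divergence_probability:.3f}")

By the power-mean inequality these values are ordered robustness <= accuracy <= decisiveness, which is why the arithmetic and −2/3 means serve as upper and lower error bars on the geometric-mean accuracy; over-confidence pushes the two bars apart in the directions the abstract describes.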

Bibliographic Details
Main Author: Kenric P. Nelson (Electrical and Computer Engineering Department, Boston University, Boston, MA 02215, USA)
Format: Article
Language: English
Published: MDPI AG, 2017-06-01
Series: Entropy
DOI: 10.3390/e19060286
Subjects: probability; inference; information theory; Bayesian; generalized mean
Online Access: http://www.mdpi.com/1099-4300/19/6/286