Generalizing Information to the Evolution of Rational Belief

Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explored in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches.

Bibliographic Details
Main Authors: Jed A. Duersch, Thomas A. Catanach (Sandia National Laboratories, Livermore, CA 94550, USA)
Format: Article
Language: English
Published: MDPI AG, 2020-01-01
Series: Entropy, Vol. 22, No. 1, Article 108
ISSN: 1099-4300
DOI: 10.3390/e22010108
Subjects: information; Bayesian inference; entropy; self-information; mutual information; Kullback–Leibler divergence; Lindley information; maximal uncertainty; proper utility
Online Access: https://www.mdpi.com/1099-4300/22/1/108
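
An illustration of the abstract's framing: entropy is the information one expects to gain upon realization of a discrete latent variable, and the Kullback–Leibler divergence measures the change in belief from a prior to a posterior distribution. The sketch below is not drawn from the article itself; the distributions are hypothetical and the only dependency assumed is NumPy.

    import numpy as np

    # Hypothetical prior belief over a three-outcome discrete latent variable.
    prior = np.array([0.5, 0.3, 0.2])
    # Hypothetical posterior belief after a Bayesian update on new evidence.
    posterior = np.array([0.7, 0.2, 0.1])

    # Realization (self) information of each outcome under the prior: -log p(x).
    self_info = -np.log(prior)

    # Entropy: the information we expect to gain upon realization, E[-log p(x)].
    entropy = float(np.dot(prior, self_info))

    # Kullback–Leibler divergence: the expected change in belief when moving
    # from the prior to the posterior, sum q(x) * log(q(x) / p(x)).
    kl_divergence = float(np.sum(posterior * np.log(posterior / prior)))

    print(f"Entropy of prior:       {entropy:.4f} nats")
    print(f"KL(posterior || prior): {kl_divergence:.4f} nats")

On this reading, a divergence of zero means the evidence left belief unchanged, while larger values register larger revisions of belief.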