Quantifying Synergistic Information Using Intermediate Stochastic Variables †

Bibliographic Details
Main Authors: Rick Quax, Omri Har-Shemesh, Peter M. A. Sloot
Format: Article
Language: English
Published: MDPI AG, 2017-02-01
Series: Entropy, Vol. 19, Iss. 2, Article 85
DOI: 10.3390/e19020085
ISSN: 1099-4300
Author Affiliations: Rick Quax and Omri Har-Shemesh, Computational Science Lab, University of Amsterdam, 1098 XH Amsterdam, The Netherlands; Peter M. A. Sloot, The Institute for Advanced Study, University of Amsterdam, Oude Turfmarkt 147, 1012 GC Amsterdam, The Netherlands
Subjects: synergy; synergistic information; synergistic entropy; information theory; stochastic variables; higher order information
Online Access: http://www.mdpi.com/1099-4300/19/2/85
Full description

Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of single-source predictions. It is an essential phenomenon in biology, for example in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response; it also arises in social cooperation processes and in statistical inference tasks in machine learning. Here we propose a measure of synergistic entropy and synergistic information derived from first principles. The proposed measure relies on so-called synergistic random variables (SRVs), which are constructed to have zero mutual information with each individual source variable but non-zero mutual information with the complete set of source variables. We prove several basic and desirable properties of our measure, including bounds and additivity properties. In addition, we prove several important consequences of our measure, including the fact that different types of synergistic information may co-exist between the same sets of variables. A numerical implementation is provided, which we use to demonstrate that synergy is associated with resilience to noise. Our measure may be a marked step forward in the study of multivariate information theory and its numerous applications.
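To make the SRV condition concrete, consider the canonical XOR illustration of synergy (a standard textbook example, not taken from this paper): for two independent fair bits X1 and X2, the variable S = X1 XOR X2 satisfies I(S; X1) = I(S; X2) = 0 while I(S; X1, X2) = 1 bit, so all of S's information about the sources is synergistic. The following minimal Python sketch, an illustrative assumption rather than the authors' published implementation, verifies these three quantities directly from the joint distribution.

# Minimal sketch (not the paper's implementation): check that
# S = X1 XOR X2 behaves as a synergistic random variable (SRV)
# for two independent fair bits:
#   I(S; X1) = I(S; X2) = 0, while I(S; X1, X2) = 1 bit.
import itertools
from math import log2

# Joint distribution p(x1, x2, s) with s = x1 ^ x2; each of the
# four (x1, x2) combinations is equally likely.
p = {(x1, x2, x1 ^ x2): 0.25
     for x1, x2 in itertools.product((0, 1), repeat=2)}

def marginal(dist, idxs):
    """Marginalize the joint distribution onto the given variable indices."""
    out = {}
    for outcome, prob in dist.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + prob
    return out

def mutual_information(dist, a_idxs, b_idxs):
    """I(A; B) in bits, computed exactly from the joint distribution."""
    pa, pb = marginal(dist, a_idxs), marginal(dist, b_idxs)
    pab = marginal(dist, a_idxs + b_idxs)
    return sum(prob * log2(prob / (pa[k[:len(a_idxs)]] * pb[k[len(a_idxs):]]))
               for k, prob in pab.items() if prob > 0)

# Variable indices: 0 -> X1, 1 -> X2, 2 -> S.
print(mutual_information(p, (2,), (0,)))    # I(S; X1)     -> 0.0
print(mutual_information(p, (2,), (1,)))    # I(S; X2)     -> 0.0
print(mutual_information(p, (2,), (0, 1)))  # I(S; X1, X2) -> 1.0

Running the sketch prints 0.0, 0.0, and 1.0, matching the defining SRV property: mutual information with each individual source vanishes while mutual information with the complete set of sources does not.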