Quantifying Redundant Information in Predicting a Target Random Variable
We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure, and propose new measures with some desirable properties.
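For context, one well-known example of such a redundancy measure is the Williams–Beer quantity I_min, which averages, over the target's outcomes, the smallest specific information that any single source variable provides about that outcome. This is offered only as an illustrative sketch of what "redundant information about a target" can look like; it is not necessarily the measure proposed in this article.

```latex
% Illustrative example: the Williams--Beer redundancy measure I_min for two
% sources X_1, X_2 and a target Y (not necessarily the measure proposed here).
\[
  I_{\min}\bigl(Y; \{X_1\}, \{X_2\}\bigr)
    = \sum_{y} p(y) \, \min_{i \in \{1,2\}} I\bigl(Y = y;\, X_i\bigr),
\]
% where the specific information of source X_i about the outcome Y = y is
\[
  I\bigl(Y = y;\, X_i\bigr)
    = \sum_{x_i} p(x_i \mid y) \, \log \frac{p(y \mid x_i)}{p(y)}.
\]
```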
Main Authors: Virgil Griffith, Tracey Ho
Format: Article
Language: English
Published: MDPI AG, 2015-07-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/17/7/4644
Similar Items
- Quantifying Synergistic Information Using Intermediate Stochastic Variables †
  by: Rick Quax, et al.
  Published: (2017-02-01)
- Intersection Information Based on Common Randomness
  by: Virgil Griffith, et al.
  Published: (2014-04-01)
- Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables
  by: Giuseppe Pica, et al.
  Published: (2017-08-01)
- The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy
  by: Daniel Chicharro, et al.
  Published: (2018-03-01)
- Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss
  by: Daniel Chicharro, et al.
  Published: (2017-02-01)