Mutual Information Gain and Linear/Nonlinear Redundancy for Agent Learning, Sequence Analysis, and Modeling
In many applications, intelligent agents need to identify any structure or apparent randomness in an environment and respond appropriately. We use the relative entropy to separate and quantify the presence of both linear and nonlinear redundancy in a sequence and we introduce the new quantities of t...
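The abstract in this record is truncated, but the core tool it names — relative entropy (Kullback-Leibler divergence) used to quantify redundancy in a sequence — can be illustrated. The sketch below is an assumption-laden illustration, not the paper's actual estimators: it compares the empirical symbol distribution of a sequence against a memoryless uniform reference model, so a divergence of 0 bits means no first-order redundancy relative to that model.

```python
import math
from collections import Counter

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) in bits between two
    discrete distributions given as {symbol: probability} dicts."""
    return sum(pi * math.log2(pi / q[s]) for s, pi in p.items() if pi > 0)

def empirical_dist(seq):
    """Empirical (first-order) symbol distribution of a sequence."""
    n = len(seq)
    return {s: c / n for s, c in Counter(seq).items()}

# Illustrative reference model (an assumption): an i.i.d. fair-coin source.
seq = "0010110100011101"
p = empirical_dist(seq)
q = {"0": 0.5, "1": 0.5}
print(relative_entropy(p, q))  # 0.0 bits: symbol frequencies match the model
```

A skewed sequence such as `"0001000100000001"` yields a positive divergence, signaling redundancy an agent could exploit; the paper's linear/nonlinear decomposition goes beyond this first-order comparison.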
Main Author: Jerry D. Gibson
Format: Article
Language: English
Published: MDPI AG, 2020-05-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/22/6/608
Similar Items
- Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss
  by: Daniel Chicharro, et al. Published: (2017-02-01)
- Fuzzy Mutual Information Based min-Redundancy and Max-Relevance Heterogeneous Feature Selection
  by: Daren Yu, et al. Published: (2011-08-01)
- The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy
  by: Daniel Chicharro, et al. Published: (2018-03-01)
- Multi-label feature selection algorithm based on joint mutual information of max-relevance and min-redundancy
  by: Li ZHANG, et al. Published: (2018-05-01)