Exact Test of Independence Using Mutual Information
Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence).
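
The article's exact test builds surrogate symbol sequences with prescribed transition counts; the details are in the full text. As a rough, generic illustration only (not the authors' construction), the sketch below shows a shuffle-surrogate test of the null hypothesis I(X;Y) = 0 using a plug-in mutual information estimate. All function names here are hypothetical.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of mutual information (in nats) from paired discrete samples."""
    n = len(x)
    joint, px, py = {}, {}, {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
        px[xi] = px.get(xi, 0) + 1
        py[yi] = py.get(yi, 0) + 1
    mi = 0.0
    for (xi, yi), c in joint.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with counts substituted for probabilities
        mi += (c / n) * np.log(c * n / (px[xi] * py[yi]))
    return mi

def shuffle_mi_test(x, y, num_surrogates=1000, rng=None):
    """Surrogate-based p-value for H0: I(X;Y) = 0 (independence).

    Surrogates are formed by shuffling y, which preserves the marginal
    distributions but destroys any dependence between x and y. This is a
    generic permutation test, not the exact transition-count method of the paper."""
    rng = np.random.default_rng() if rng is None else rng
    observed = mutual_information(x, y)
    y = np.asarray(y)
    count = sum(
        mutual_information(x, rng.permutation(y)) >= observed
        for _ in range(num_surrogates)
    )
    # Include the observed statistic itself so the p-value is never exactly zero.
    return observed, (count + 1) / (num_surrogates + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 3, size=200)
    y = (x + rng.integers(0, 2, size=200)) % 3   # y depends on x
    mi, p = shuffle_mi_test(x, y, rng=rng)
    print(f"estimated MI = {mi:.3f} nats, p-value = {p:.4f}")
```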
| Main Authors: | Shawn D. Pethel, Daniel W. Hahs |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2014-05-01 |
| Series: | Entropy |
| Subjects: | |
| Online Access: | http://www.mdpi.com/1099-4300/16/5/2839 |
Similar Items

- A Conditional Mutual Information Estimator for Mixed Data and an Associated Conditional Independence Test
  by: Lei Zan, et al.
  Published: (2022-09-01)
- Measuring Independence between Statistical Randomness Tests by Mutual Information
  by: Jorge Augusto Karell-Albo, et al.
  Published: (2020-07-01)
- Complexity Reduction in Analyzing Independence between Statistical Randomness Tests Using Mutual Information
  by: Jorge Augusto Karell-Albo, et al.
  Published: (2023-11-01)
- An Estimator of Mutual Information and its Application to Independence Testing
  by: Joe Suzuki
  Published: (2016-03-01)
- Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis
  by: Elif Tuna, et al.
  Published: (2022-12-01)