Exact Test of Independence Using Mutual Information


Bibliographic Details
Main Authors: Shawn D. Pethel, Daniel W. Hahs
Format: Article
Language: English
Published: MDPI AG 2014-05-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/16/5/2839
Description
Summary: Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that the data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this condition. The algorithm given in this paper is the first exact significance test of mutual information that takes the Markov structure into account. When the Markov order is unknown or indefinite, an exact test is first used to determine an effective Markov order.
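To illustrate the kind of test the abstract describes, the sketch below implements the standard iid case only: a permutation-based exact NHST of mutual information, where one sequence is shuffled to break any dependence while preserving its symbol counts. This is not the paper's algorithm; the paper's contribution is to replace plain shuffles with surrogates that preserve transition counts, so that Markov-structured data do not inflate the false-positive rate. All function names here are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from two equal-length symbol sequences."""
    n = len(x)
    joint, px, py = {}, {}, {}
    for a, b in zip(x, y):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

def permutation_test_mi(x, y, n_perm=1000, seed=0):
    """One-sided permutation NHST of H0: I(X;Y) = 0 (independence).

    Valid only for iid samples: shuffling y preserves its marginal
    distribution but destroys any within-sequence (Markov) structure,
    which is exactly the limitation the paper addresses with
    transition-count-preserving surrogates.
    """
    rng = np.random.default_rng(seed)
    observed = mutual_information(x, y)
    y = np.asarray(y)
    exceed = sum(
        mutual_information(x, rng.permutation(y)) >= observed
        for _ in range(n_perm)
    )
    # Add-one correction keeps the p-value valid (never exactly zero) under H0.
    return observed, (exceed + 1) / (n_perm + 1)
```

With perfectly dependent sequences the observed mutual information sits above essentially every shuffled value, so the p-value approaches its floor of 1/(n_perm + 1); for independent sequences it is roughly uniform on (0, 1].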
ISSN: 1099-4300