A Method to Present and Analyze Ensembles of Information Sources

Bibliographic Details
Main Authors: Nicholas M. Timme, David Linsenbardt, Christopher C. Lapish
Format: Article
Language: English
Published: MDPI AG, 2020-05-01
Series: Entropy
Citation: Entropy 2020, 22(5), 580
ISSN: 1099-4300
DOI: 10.3390/e22050580
Collection: DOAJ (Directory of Open Access Journals)
Subjects: information theory; information ensemble; ensemble comparison; population coding; mutual information; neural ensemble
Online Access: https://www.mdpi.com/1099-4300/22/5/580

Description:
Information theory is a powerful tool for analyzing complex systems. In many areas of neuroscience, it is now possible to gather data from large ensembles of neural variables (e.g., data from many neurons, genes, or voxels). The individual variables can be analyzed with information theory to provide estimates of information shared between variables (forming a network between variables), or between neural variables and other variables (e.g., behavior or sensory stimuli). However, it can be difficult to (1) evaluate whether the ensemble is significantly different from what would be expected in a purely noisy system and (2) determine whether two ensembles are different. Herein, we introduce relatively simple methods to address these problems by analyzing ensembles of information sources. We demonstrate how an ensemble built of mutual information connections can be compared to null surrogate data to determine if the ensemble is significantly different from noise. Next, we show how two ensembles can be compared using a randomization process to determine if the sources in one contain more information than the other. All code necessary to carry out these analyses and demonstrations is provided.
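
The first analysis the abstract describes, comparing an ensemble of mutual information (MI) values against null surrogate data, can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors' released code: the plug-in MI estimator, the shuffle-based surrogate construction, and all function names here are illustrative choices.

```python
# Sketch: ensemble of pairwise MI values vs. a shuffle-based null ensemble.
# NOT the authors' released code; estimator and names are assumptions.
import numpy as np

def mutual_information(x, y):
    """Plug-in MI estimate (in bits) for two discrete 1-D arrays."""
    joint = np.histogram2d(x, y, bins=(np.unique(x).size, np.unique(y).size))[0]
    pxy = joint / joint.sum()                 # joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0                              # avoid log(0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

def ensemble_vs_null(data, n_surrogates=100, rng=None):
    """MI for every variable pair, plus a null ensemble from shuffled data."""
    rng = np.random.default_rng(rng)
    n_vars = data.shape[0]
    pairs = [(i, j) for i in range(n_vars) for j in range(i + 1, n_vars)]
    real = np.array([mutual_information(data[i], data[j]) for i, j in pairs])
    # Shuffling one variable in time destroys real dependence but keeps
    # each variable's marginal statistics, giving a noise-only null.
    null = np.array([
        [mutual_information(rng.permutation(data[i]), data[j]) for i, j in pairs]
        for _ in range(n_surrogates)
    ])
    return real, null

# Toy demonstration: 10 binary "neurons", 500 time bins, shared drive.
rng = np.random.default_rng(0)
drive = rng.integers(0, 2, 500)
data = np.array([np.where(rng.random(500) < 0.3, 1 - drive, drive)
                 for _ in range(10)])
real, null = ensemble_vs_null(data, n_surrogates=50, rng=1)
print(f"mean real MI: {real.mean():.3f} bits; "
      f"mean null MI: {null.mean():.3f} bits")
```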
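The second analysis, comparing two ensembles via randomization, corresponds to a standard permutation test on ensemble labels. Again a hedged sketch rather than the paper's implementation: the test statistic (difference of ensemble means) is one common choice, and the data here are fabricated toy values purely for demonstration.

```python
# Sketch: randomization test asking whether the information sources in one
# ensemble carry more information than those in another.
# Illustrative assumption, not the authors' released code.
import numpy as np

def compare_ensembles(info_a, info_b, n_perm=10000, rng=None):
    """One-sided permutation p-value for mean(info_a) > mean(info_b)."""
    rng = np.random.default_rng(rng)
    observed = info_a.mean() - info_b.mean()
    pooled = np.concatenate([info_a, info_b])
    n_a = info_a.size
    count = 0
    for _ in range(n_perm):
        shuffled = rng.permutation(pooled)        # randomize ensemble labels
        diff = shuffled[:n_a].mean() - shuffled[n_a:].mean()
        if diff >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)   # add-one keeps p > 0

# Toy demonstration with two ensembles of MI values (in bits).
rng = np.random.default_rng(2)
ensemble_a = rng.gamma(2.0, 0.05, size=40)   # slightly more informative
ensemble_b = rng.gamma(1.5, 0.05, size=40)
obs, p = compare_ensembles(ensemble_a, ensemble_b, rng=3)
print(f"observed mean difference: {obs:.4f} bits, permutation p = {p:.4f}")
```

The add-one correction in the returned p-value is a common convention that keeps permutation p-values strictly positive for finite permutation counts.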

Author Affiliations:
Nicholas M. Timme: Department of Psychology, Indiana University—Purdue University Indianapolis, Indianapolis, IN 46202, USA
David Linsenbardt: Department of Neurosciences, University of New Mexico School of Medicine, Albuquerque, NM 87131, USA
Christopher C. Lapish: Department of Psychology, Indiana University—Purdue University Indianapolis, Indianapolis, IN 46202, USA