A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
Calculations of the entropy of a signal, or of the mutual information between two variables, are valuable analytical tools in neuroscience. They can be applied to all types of data, capture non-linear interactions, and are model-independent. Yet the limited size and number of recordings one can colle...
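The abstract refers to estimating entropy and mutual information from recorded data. As a point of reference (not the authors' method, which the article itself develops), the standard "plug-in" estimator computes probabilities from observed frequencies and applies the Shannon formulas directly; a minimal sketch for discrete (e.g. binned) data:

```python
import math
from collections import Counter

def entropy(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in bits.

    Probabilities are taken as observed frequencies; note this estimator
    is known to be biased when the sample size is small.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """Plug-in mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), in bits."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Perfectly dependent variables share 1 bit; independent ones share none.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0
```

Continuous signals would first need discretization (binning), and the small-sample bias of this naive estimator is precisely the kind of limitation that motivates more careful estimation schemes.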
Main Authors: Mickael Zbili, Sylvain Rama
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-06-01
Series: Frontiers in Neuroinformatics
Online Access: https://www.frontiersin.org/articles/10.3389/fninf.2021.596443/full
Similar Items
- Prototipe Kompresi Lossless Audio Codec Menggunakan Entropy Encoding
  by: Andreas Soegandi
  Published: (2010-12-01)
- Lossless Compression of Plenoptic Camera Sensor Images
  by: Ioan Tabus, et al.
  Published: (2021-01-01)
- Autonomous Parameter Adjustment Method for Lossless Data Compression on Adaptive Stream-Based Entropy Coding
  by: Shinichi Yamagiwa, et al.
  Published: (2020-01-01)
- High-Throughput FPGA-Based Hardware Accelerators for Deflate Compression and Decompression Using High-Level Synthesis
  by: Morgan Ledwon, et al.
  Published: (2020-01-01)
- A Grayscale Semi-Lossless Image Compression Technique Using RLE
  by: Rafeeq Al-hashemi, et al.
  Published: (2011-01-01)