Complexity as Causal Information Integration
Complexity measures in the context of the Integrated Information Theory of consciousness try to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system and one without causal cross-connections. Various measures have been proposed and compared in this setting.
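The abstract describes the basic construction behind these measures: compare the full joint distribution of the system with the closest distribution in a family without causal cross-connections, using the KL-divergence. As a rough, self-contained illustration of that kind of comparison (a toy sketch, not the paper's actual split model; the two-node distribution and all names below are assumptions), one can measure how far a small joint distribution is from the product of its marginals:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as flat arrays."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy joint distribution of two binary nodes X1, X2 (rows: X1, columns: X2).
p_joint = np.array([[0.40, 0.10],
                    [0.10, 0.40]])

# "Disconnected" model: the product of the two marginals, i.e. a system
# in which X1 and X2 carry no information about each other.
p_x1 = p_joint.sum(axis=1, keepdims=True)   # shape (2, 1)
p_x2 = p_joint.sum(axis=0, keepdims=True)   # shape (1, 2)
p_split = p_x1 * p_x2                       # shape (2, 2)

# KL(full || disconnected); here this equals the mutual information of
# X1 and X2 and serves as a crude stand-in for "integration".
print(kl_divergence(p_joint.ravel(), p_split.ravel()))
```

The measures discussed in the article refine this idea: the split model keeps each node's own dynamics but removes the cross-connections between nodes, and Φ_CII additionally allows a latent common exterior influence.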
| Main Authors: | Carlotta Langer, Nihat Ay |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2020-09-01 |
| Series: | Entropy |
| Subjects: | complexity; integrated information; causality; conditional independence; em-algorithm |
| Online Access: | https://www.mdpi.com/1099-4300/22/10/1107 |
author | Carlotta Langer, Nihat Ay
collection | DOAJ |
description | Complexity measures in the context of the Integrated Information Theory of consciousness try to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system and one without causal cross-connections. Various measures have been proposed and compared in this setting. We will discuss a class of information geometric measures that aim at assessing the intrinsic causal cross-influences in a system. One promising candidate among these measures, denoted by Φ_CIS, is based on conditional independence statements and satisfies all of the properties that have been postulated as desirable. Unfortunately, it does not have a graphical representation, which makes it less intuitive and difficult to analyze. We propose an alternative approach using a latent variable, which models a common exterior influence. This leads to a measure Φ_CII, Causal Information Integration, that satisfies all of the required conditions. Our measure can be calculated using an iterative information geometric algorithm, the em-algorithm. Therefore, we are able to compare its behavior to existing integrated information measures.
format | Article |
id | doaj.art-fcbb7b09addd4dc1adf16f3a8d39ce3d |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
publishDate | 2020-09-01 |
publisher | MDPI AG |
series | Entropy |
doi | 10.3390/e22101107
author_affiliation | Max Planck Institute for Mathematics in the Sciences, 04103 Leipzig, Germany (both authors)
title | Complexity as Causal Information Integration |
topic | complexity; integrated information; causality; conditional independence; em-algorithm
url | https://www.mdpi.com/1099-4300/22/10/1107 |
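The description field above notes that Φ_CII is computed with an iterative information geometric procedure, the em-algorithm, applied to a model family containing a latent variable for a common exterior influence. The sketch below is again a toy assumption, not the paper's implementation: the observed distribution, the binary latent variable H, and all names are made up for illustration. It shows the alternating e-step (extend the observed distribution by the model's conditional over the latent variable) and m-step (project back onto the latent-variable family), and reports the remaining KL-divergence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed joint distribution over two binary nodes X1, X2 (toy data).
p_obs = np.array([[0.35, 0.15],
                  [0.15, 0.35]])

# Model family: a hidden binary variable H with X1 and X2 conditionally
# independent given H, i.e. q(x1, x2, h) = q(h) * q(x1|h) * q(x2|h).
n_h = 2
q_h = np.full(n_h, 1.0 / n_h)
q_x1_h = rng.dirichlet(np.ones(2), size=n_h)   # rows: h, columns: x1
q_x2_h = rng.dirichlet(np.ones(2), size=n_h)   # rows: h, columns: x2

def model_joint():
    """Current model distribution q(x1, x2, h)."""
    return np.einsum('h,hi,hj->ijh', q_h, q_x1_h, q_x2_h)

for _ in range(200):
    q_full = model_joint()                     # shape (x1, x2, h)
    q_vis = q_full.sum(axis=2)                 # q(x1, x2)
    # e-step: extend p_obs by the model's conditional q(h | x1, x2).
    p_ext = p_obs[:, :, None] * q_full / q_vis[:, :, None]
    # m-step: project the extended distribution back onto the family.
    q_h = p_ext.sum(axis=(0, 1))
    q_x1_h = (p_ext.sum(axis=1) / q_h).T       # q(x1 | h)
    q_x2_h = (p_ext.sum(axis=0) / q_h).T       # q(x2 | h)

# Remaining divergence of the observed joint from the latent-variable family.
q_vis = model_joint().sum(axis=2)
mask = p_obs > 0
print(float(np.sum(p_obs[mask] * np.log(p_obs[mask] / q_vis[mask]))))
```

Because this toy family is rich enough to reproduce a 2x2 joint, the divergence typically shrinks toward zero here (the em-algorithm can also stop in local minima); in the article the family is further constrained by the causal split between the nodes, which is what makes the resulting value a non-trivial complexity measure.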