Generic predictions of output probability based on complexities of inputs and outputs

For a broad class of input-output maps, arguments based on the coding theorem from algorithmic information theory (AIT) predict that simple (low Kolmogorov complexity) outputs are exponentially more likely to occur upon uniform random sampling of inputs than complex outputs are. Here, we derive probability bounds that are based on the complexities of the inputs as well as the outputs, rather than just on the complexities of the outputs. The more that outputs deviate from the coding theorem bound, the lower the complexity of their inputs. Since the number of low complexity inputs is limited, this behaviour leads to an effective lower bound on the probability. Our new bounds are tested for an RNA sequence to structure map, a finite state transducer and a perceptron. The success of these new methods opens avenues for AIT to be more widely used.

Bibliographic Details
Main Authors: Dingle, K; Pérez, GV; Louis, AA
Format: Journal article
Language: English
Published: Nature Research, 2020
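The perceptron test mentioned in the abstract suggests a quick way to observe the simplicity bias numerically. The sketch below is an illustrative toy, not the authors' actual experiment: it uniformly samples perceptron weights and a bias, records which Boolean labelling of the 8 vertices of {0,1}^3 the perceptron produces, and uses the number of constant runs in the output string as a crude stand-in for Kolmogorov complexity (a proxy of our own choosing, not from the paper).

```python
import random
from collections import Counter
from itertools import product

# Uniformly sample perceptron parameters and tally the induced
# Boolean labelling of the cube vertices (the "output" of the map).
random.seed(0)
points = list(product([0, 1], repeat=3))  # 8 vertices of {0,1}^3
N = 100_000
counts = Counter()
for _ in range(N):
    w = [random.uniform(-1, 1) for _ in range(3)]
    b = random.uniform(-1, 1)
    out = "".join(
        "1" if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else "0"
        for x in points
    )
    counts[out] += 1

def n_runs(s):
    """Crude complexity proxy: number of maximal constant runs in s."""
    return 1 + sum(a != b for a, b in zip(s, s[1:]))

# High-probability outputs should be simple (few runs).
for out, c in counts.most_common(5):
    print(f"{out}  P≈{c / N:.3f}  runs={n_runs(out)}")
```

Running this, the constant labellings (run count 1) dominate the frequency table, and far fewer than the 2^8 = 256 conceivable labellings ever occur — consistent with the coding-theorem-style prediction that simple outputs of a map occur with much higher probability under uniform random sampling of inputs.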