On the expressivity of recurrent neural cascades
Recurrent Neural Cascades (RNCs) are recurrent neural networks with no cyclic dependencies among their recurrent neurons. This class of recurrent networks has received considerable attention in practice. Besides training methods for a fixed architecture such as backpropagation, the cascade architecture naturally allows for constructive learning methods, where recurrent neurons are added incrementally, one at a time, often yielding smaller networks. Furthermore, acyclicity amounts to a structural prior that, even for the same number of neurons, yields a more favourable sample complexity than a fully-connected architecture. A central question is whether these advantages of the cascade architecture come at the cost of reduced expressivity. We provide new insights into this question. We show that the regular languages captured by RNCs with sign and tanh activations and positive recurrent weights are exactly the star-free regular languages. To establish our results, we develop a novel framework in which the capabilities of RNCs are assessed by analysing which semigroups and groups a single neuron is able to implement. A notable implication of our framework is that RNCs can achieve the expressivity of all regular languages by introducing neurons that can implement groups.
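As a rough illustration of the architecture the abstract describes, here is a minimal sketch of one RNC time step in Python. All names, the tanh choice, and the convention that a neuron reads the already-updated states of lower neurons are assumptions for illustration, not the paper's formal definition.

```python
import math

def rnc_step(states, x, weights):
    """One step of a recurrent neural cascade (illustrative sketch).

    Neurons are ordered; neuron i depends on the input and on neurons
    j < i only, so the dependency graph among recurrent neurons is
    acyclic. 'states' is the list of previous states, 'x' a scalar
    input, and 'weights[i]' holds:
      'self'  - recurrent weight of neuron i (positive in the class
                the paper characterises as star-free),
      'in'    - input weight,
      'lower' - one weight per neuron j < i in the cascade.
    """
    new_states = []
    for i, w in enumerate(weights):
        pre = w['self'] * states[i] + w['in'] * x
        # Acyclicity: only already-updated lower neurons feed in here.
        for j, w_lower in enumerate(w['lower']):
            pre += w_lower * new_states[j]
        new_states.append(math.tanh(pre))
    return new_states

# A two-neuron cascade run on a short binary sequence. Constructive
# learning would grow such a cascade by appending neurons at the end.
weights = [
    {'self': 0.5, 'in': 1.0, 'lower': []},
    {'self': 0.5, 'in': 1.0, 'lower': [0.8]},
]
states = [0.0, 0.0]
for x in [1, 0, 1]:
    states = rnc_step(states, x, weights)
print(states)
```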
Main Authors: | Knorozova, NA; Ronca, A
---|---
Format: | Conference item
Language: | English
Published: | Association for the Advancement of Artificial Intelligence, 2024
Institution: | University of Oxford
ID: | oxford-uuid:33afdedd-c69e-4ef9-9a20-524a2abd60a4
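The abstract's dividing line between star-free and general regular languages is classically illustrated by the parity language. The following note is a standard textbook example built on Schützenberger's characterisation, included for orientation; it is not taken from the paper itself, and the multiplicative update at the end is only one hypothetical way a neuron could realise the group involved.

```latex
% Standard example (Schutzenberger): parity separates star-free from
% general regular languages. Not taken from the paper itself.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Star-free languages are those definable from finite languages by union,
concatenation and complement, without Kleene star; equivalently, their
syntactic monoid contains no nontrivial group. The parity language
\[
  L \;=\; \{\, w \in \{0,1\}^{*} \;:\; |w|_{1} \equiv 0 \pmod 2 \,\}
\]
is regular but not star-free: each letter $1$ swaps the states
``even'' and ``odd'', so the syntactic monoid of $L$ contains the
cyclic group $\mathbb{Z}_{2}$. A state update realising this action
multiplicatively (an illustration, not the paper's construction) is
\[
  h_{t} \;=\; (1 - 2x_{t})\, h_{t-1}, \qquad h_{0} = 1,
\]
which negates the state on input $1$ and preserves it on input $0$;
a word is accepted iff $h_{T} = 1$. Intuitively, an update
$h_{t} = f(a\,h_{t-1} + u\,x_{t} + b)$ with $a > 0$ and monotone $f$
is order-preserving in $h_{t-1}$, so it cannot implement this swap,
in line with the abstract's restriction to star-free languages under
positive recurrent weights.

\end{document}
```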