Efficient dendritic learning as an alternative to synaptic plasticity hypothesis
Abstract Synaptic plasticity is a long-lasting core hypothesis of brain learning that suggests local adaptation between two connecting neurons and forms the foundation of machine learning. The main complexity of synaptic plasticity is that synapses and dendrites connect neurons in series and existing experiments cannot pinpoint the significant imprinted adaptation location. We showed efficient backpropagation and Hebbian learning on dendritic trees, inspired by experimental-based evidence, for sub-dendritic adaptation and its nonlinear amplification. It has proven to achieve success rates approaching unity for handwritten digits recognition, indicating realization of deep learning even by a single dendrite or neuron. Additionally, dendritic amplification practically generates an exponential number of input crosses, higher-order interactions, with the number of inputs, which enhance success rates. However, direct implementation of a large number of the cross weights and their exhaustive manipulation independently is beyond existing and anticipated computational power. Hence, a new type of nonlinear adaptive dendritic hardware for imitating dendritic learning and estimating the computational capability of the brain must be built.
Main Authors: | Shiri Hodassman, Roni Vardi, Yael Tugendhaft, Amir Goldental, Ido Kanter |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2022-04-01 |
Series: | Scientific Reports |
Online Access: | https://doi.org/10.1038/s41598-022-10466-8 |
_version_ | 1828262356521582592 |
---|---|
author | Shiri Hodassman; Roni Vardi; Yael Tugendhaft; Amir Goldental; Ido Kanter |
author_facet | Shiri Hodassman; Roni Vardi; Yael Tugendhaft; Amir Goldental; Ido Kanter |
author_sort | Shiri Hodassman |
collection | DOAJ |
description | Abstract Synaptic plasticity is a long-lasting core hypothesis of brain learning that suggests local adaptation between two connecting neurons and forms the foundation of machine learning. The main complexity of synaptic plasticity is that synapses and dendrites connect neurons in series and existing experiments cannot pinpoint the significant imprinted adaptation location. We showed efficient backpropagation and Hebbian learning on dendritic trees, inspired by experimental-based evidence, for sub-dendritic adaptation and its nonlinear amplification. It has proven to achieve success rates approaching unity for handwritten digits recognition, indicating realization of deep learning even by a single dendrite or neuron. Additionally, dendritic amplification practically generates an exponential number of input crosses, higher-order interactions, with the number of inputs, which enhance success rates. However, direct implementation of a large number of the cross weights and their exhaustive manipulation independently is beyond existing and anticipated computational power. Hence, a new type of nonlinear adaptive dendritic hardware for imitating dendritic learning and estimating the computational capability of the brain must be built. |
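The abstract's claim that dendritic amplification generates "input crosses" (higher-order interactions among inputs) that enhance success rates can be illustrated with a minimal sketch. This is our own toy example, not the paper's model: we approximate nonlinear dendritic amplification by augmenting raw inputs with their pairwise products, after which a single linear readout, playing the role of one neuron, solves a task that is not linearly separable in the raw inputs.

```python
import numpy as np

def add_crosses(X):
    """Append all pairwise products x_i * x_j (i < j) to each input vector.

    These products stand in for the 'input crosses' the abstract describes;
    for n inputs, including all orders would give an exponential feature count.
    """
    n = X.shape[1]
    crosses = [X[:, i] * X[:, j] for i in range(n) for j in range(i + 1, n)]
    return np.hstack([X, np.column_stack(crosses)])

# XOR in {-1, +1} coordinates: famously not linearly separable in raw inputs.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1], dtype=float)  # XOR labels equal -x1*x2

# After adding the cross term x1*x2, a single weight on that feature
# solves the task exactly; a least-squares fit finds it.
Xc = add_crosses(X)                          # columns: x1, x2, x1*x2
w = np.linalg.lstsq(Xc, y, rcond=None)[0]
pred = np.sign(Xc @ w)
print(pred.tolist())                         # matches y exactly
```

The design point: the cross features do the nonlinear work, so the learnable readout stays linear, which is the sense in which a single neuron (or dendrite) with nonlinear amplification can realize what otherwise requires a multilayer network.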
first_indexed | 2024-04-13T03:54:31Z |
format | Article |
id | doaj.art-415b2555da004e2e96a94405fa2f0b0d |
institution | Directory Open Access Journal |
issn | 2045-2322 |
language | English |
last_indexed | 2024-04-13T03:54:31Z |
publishDate | 2022-04-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Scientific Reports |
spelling | doaj.art-415b2555da004e2e96a94405fa2f0b0d; Scientific Reports 12(1), 1-12 (2022-04-01); doi:10.1038/s41598-022-10466-8; Efficient dendritic learning as an alternative to synaptic plasticity hypothesis; Shiri Hodassman, Yael Tugendhaft, Amir Goldental, Ido Kanter (Department of Physics, Bar-Ilan University); Roni Vardi (Gonda Interdisciplinary Brain Research Center, Bar-Ilan University) |
spellingShingle | Shiri Hodassman; Roni Vardi; Yael Tugendhaft; Amir Goldental; Ido Kanter; Efficient dendritic learning as an alternative to synaptic plasticity hypothesis; Scientific Reports |
title | Efficient dendritic learning as an alternative to synaptic plasticity hypothesis |
title_full | Efficient dendritic learning as an alternative to synaptic plasticity hypothesis |
title_fullStr | Efficient dendritic learning as an alternative to synaptic plasticity hypothesis |
title_full_unstemmed | Efficient dendritic learning as an alternative to synaptic plasticity hypothesis |
title_short | Efficient dendritic learning as an alternative to synaptic plasticity hypothesis |
title_sort | efficient dendritic learning as an alternative to synaptic plasticity hypothesis |
url | https://doi.org/10.1038/s41598-022-10466-8 |
work_keys_str_mv | AT shirihodassman efficientdendriticlearningasanalternativetosynapticplasticityhypothesis AT ronivardi efficientdendriticlearningasanalternativetosynapticplasticityhypothesis AT yaeltugendhaft efficientdendriticlearningasanalternativetosynapticplasticityhypothesis AT amirgoldental efficientdendriticlearningasanalternativetosynapticplasticityhypothesis AT idokanter efficientdendriticlearningasanalternativetosynapticplasticityhypothesis |