Perceptron Learning and Classification in a Modeled Cortical Pyramidal Cell

The perceptron learning algorithm and its multiple-layer extension, the backpropagation algorithm, are the foundations of the present-day machine learning revolution. However, these algorithms utilize a highly simplified mathematical abstraction of a neuron; it is not clear to what extent real biophysical neurons with morphologically-extended non-linear dendritic trees and conductance-based synapses can realize perceptron-like learning. Here we implemented the perceptron learning algorithm in a realistic biophysical model of a layer 5 cortical pyramidal cell with a full complement of non-linear dendritic channels. We tested this biophysical perceptron (BP) on a classification task, where it needed to correctly binarily classify 100, 1,000, or 2,000 patterns, and a generalization task, where it was required to discriminate between two “noisy” patterns. We show that the BP performs these tasks with an accuracy comparable to that of the original perceptron, though the classification capacity of the apical tuft is somewhat limited. We concluded that cortical pyramidal neurons can act as powerful classification devices.
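For context, the sketch below shows the classic single-layer perceptron learning rule that the article maps onto a biophysical neuron model. It is a minimal, hypothetical toy example in Python/NumPy (the pattern counts, learning rate, and the simple weighted-sum threshold standing in for a somatic spike are assumptions for illustration only); it is not the authors' NEURON-based implementation of the biophysical perceptron.

    # Minimal sketch of the perceptron learning rule (toy example, not the
    # paper's biophysical model): a unit "fires" when the weighted sum of its
    # binary inputs exceeds a fixed threshold, and misclassified patterns
    # drive error-correcting weight updates.
    import numpy as np

    rng = np.random.default_rng(0)

    n_synapses = 100   # number of input weights ("synapses")
    n_patterns = 50    # number of binary patterns to classify
    eta = 0.1          # learning rate

    X = rng.integers(0, 2, size=(n_patterns, n_synapses))  # binary input patterns
    y = rng.integers(0, 2, size=n_patterns)                # target labels: fire / no fire

    w = np.zeros(n_synapses)  # synaptic weights, learned
    theta = 0.0               # fixed firing threshold (analogue of a spike threshold)

    for epoch in range(100):
        errors = 0
        for x, target in zip(X, y):
            fired = int(w @ x > theta)       # perceptron decision
            if fired != target:
                # Strengthen active inputs if the unit should have fired,
                # weaken them if it fired spuriously.
                w += eta * (target - fired) * x
                errors += 1
        if errors == 0:  # every pattern classified correctly
            break

    accuracy = np.mean([(w @ x > theta) == t for x, t in zip(X, y)])
    print(f"training accuracy: {accuracy:.2f}")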

Bibliographic Details
Main Authors: Toviah Moldwin, Idan Segev
Author Affiliations: Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel (Moldwin, Segev); Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem, Israel (Segev)
Format: Article
Language: English
Published: Frontiers Media S.A., 2020-04-01
Series: Frontiers in Computational Neuroscience
ISSN: 1662-5188
DOI: 10.3389/fncom.2020.00033
Subjects: compartmental modeling; non-linear dendrites; cortical excitatory synapses; single neuron computation; machine learning; synaptic weights
Online Access: https://www.frontiersin.org/article/10.3389/fncom.2020.00033/full