Backpropagation With Sparsity Regularization for Spiking Neural Network Learning

The spiking neural network (SNN) is a possible pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven and sparse characteristics of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to improve both spiking and synaptic sparsity. Backpropagation with a spiking regularization term minimizes the firing rate while preserving accuracy; the scheme captures temporal information and extends to spiking recurrent layers to support brain-like structure learning. A rewiring mechanism with synaptic regularization further reduces the redundancy of the network structure, with pruning and growth of synapses regulated by weight and gradient. Experimental results demonstrate that the network learned by BPSR is synaptically sparse and closely resembles biological systems. BPSR not only balances accuracy against firing rate but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR-10, and further test it on the MIT-BIH and gas sensor datasets. Results show that the algorithm achieves accuracy comparable or superior to related works, with sparse spikes and synapses.


Bibliographic Details
Main Authors: Yulong Yan, Haoming Chu, Yi Jin, Yuxiang Huan, Zhuo Zou, Lirong Zheng
Format: Article
Language: English
Published: Frontiers Media S.A., 2022-04-01
Series: Frontiers in Neuroscience
Subjects: spiking neural network; backpropagation; sparsity regularization; spiking sparsity; synaptic sparsity
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2022.760298/full
collection DOAJ
description The spiking neural network (SNN) is a possible pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven and sparse characteristics of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to improve both spiking and synaptic sparsity. Backpropagation with a spiking regularization term minimizes the firing rate while preserving accuracy; the scheme captures temporal information and extends to spiking recurrent layers to support brain-like structure learning. A rewiring mechanism with synaptic regularization further reduces the redundancy of the network structure, with pruning and growth of synapses regulated by weight and gradient. Experimental results demonstrate that the network learned by BPSR is synaptically sparse and closely resembles biological systems. BPSR not only balances accuracy against firing rate but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR-10, and further test it on the MIT-BIH and gas sensor datasets. Results show that the algorithm achieves accuracy comparable or superior to related works, with sparse spikes and synapses.
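The description names two mechanisms, spiking regularization (a firing-rate penalty in the loss) and weight/gradient-based rewiring, but the record contains no code. Below is a minimal, purely illustrative NumPy sketch of those two ideas; it is not the authors' implementation, and all names and parameters (`regularized_loss`, `rewire`, `lam`, `prune_frac`) are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def regularized_loss(task_loss, spikes, lam=1e-3):
    # Spiking regularization: add a penalty proportional to the mean
    # firing rate, pushing the network toward sparse spiking activity.
    return task_loss + lam * spikes.mean()

def rewire(w, grad, mask, prune_frac=0.05):
    # Synaptic regularization via rewiring: prune the weakest active
    # synapses (smallest |weight|) and grow up to the same number of
    # inactive synapses where the gradient magnitude is largest.
    w, mask = w.copy(), mask.copy()          # mask: boolean connectivity
    active = np.flatnonzero(mask)
    inactive = np.flatnonzero(~mask)
    k = max(1, int(prune_frac * active.size))
    prune = active[np.argsort(np.abs(w.ravel()[active]))[:k]]
    mask.ravel()[prune] = False
    w.ravel()[prune] = 0.0
    # If fewer than k synapses are inactive, grow as many as available.
    grow = inactive[np.argsort(-np.abs(grad.ravel()[inactive]))[:k]]
    mask.ravel()[grow] = True
    w.ravel()[grow] = 0.0                    # regrown synapses restart at zero
    return w, mask
```

In this sketch the total synapse count stays roughly constant across rewiring steps (each pruned synapse is replaced by a grown one while inactive slots remain), which is one simple way to hold a fixed synaptic sparsity level during training.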
format Article
id doaj.art-4d4e335090f842d5858b5cd9e45ae930
institution Directory Open Access Journal
issn 1662-453X
language English
publishDate 2022-04-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Neuroscience
topic spiking neural network
backpropagation
sparsity regularization
spiking sparsity
synaptic sparsity