Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization

Spiking Neural Networks (SNNs) may offer an energy-efficient alternative for implementing deep learning applications. In recent years, there have been several proposals focused on supervised (conversion, spike-based gradient descent) and unsupervised (spike-timing-dependent plasticity) training methods to improve the accuracy of SNNs on large-scale tasks. However, each of these methods suffers from scalability, latency, and accuracy limitations. In this paper, we propose novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations to improve the learning ability of the training methodologies, yielding competitive accuracy while providing large efficiency gains over their artificial counterparts (i.e., conventional deep learning/artificial neural networks). Our techniques apply to VGG/residual architectures and are compatible with all forms of training methodologies. Our analysis reveals that the proposed solutions yield near state-of-the-art accuracy with significant energy efficiency and reduced parameter overhead, translating to hardware improvements on complex visual recognition tasks such as the CIFAR-10 and ImageNet datasets.
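
The record itself contains no implementation details. Purely as an illustration of the hybrid artificial-and-spiking idea named in the abstract, the sketch below combines conventional ReLU layers with a rate-coded leaky-integrate-and-fire (LIF) spiking stage. Everything here is an assumption for illustration, not the authors' code: the layer sizes, time steps, threshold/decay values, and the LIFLayer/HybridNet names are invented, and the paper's backward residual connections and stochastic softmax are not sketched.

```python
import torch
import torch.nn as nn

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire neurons unrolled over discrete time steps.

    Each step, the membrane potential leaks by `decay`, accumulates the
    input current, and emits a binary spike when it crosses `threshold`
    (soft reset: the threshold is subtracted after a spike). Note the
    hard threshold is non-differentiable; actual spike-based training
    would need a surrogate gradient, which this sketch omits.
    """
    def __init__(self, threshold=1.0, decay=0.9, time_steps=20):
        super().__init__()
        self.threshold = threshold
        self.decay = decay
        self.time_steps = time_steps

    def forward(self, current):
        mem = torch.zeros_like(current)
        spikes = []
        for _ in range(self.time_steps):
            mem = self.decay * mem + current          # leaky integration
            spike = (mem >= self.threshold).float()   # fire
            mem = mem - spike * self.threshold        # soft reset
            spikes.append(spike)
        # Mean spike count over time approximates an analog activation
        # (rate coding).
        return torch.stack(spikes).mean(dim=0)

class HybridNet(nn.Module):
    """Conventional (ReLU) feature extractor followed by a spiking stage."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.analog = nn.Sequential(                  # artificial (ANN) part
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.spiking = LIFLayer()                     # spiking (SNN) part
        self.classifier = nn.Linear(64 * 4 * 4, num_classes)

    def forward(self, x):
        h = self.analog(x)
        h = self.spiking(h)                           # rate-coded spikes
        return self.classifier(h.flatten(1))

net = HybridNet()
logits = net(torch.randn(2, 3, 32, 32))              # CIFAR-10-sized input
print(logits.shape)                                   # torch.Size([2, 10])
```

The intuition behind such hybridization is that cheap analog layers extract features once, while the spiking stage carries the sparse, event-driven computation that yields the energy savings; how the paper actually partitions the network is specified in the full text, not here.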

Bibliographic Details
Main Authors: Priyadarshini Panda (Department of Electrical Engineering, Yale University, New Haven, CT, United States); Sai Aparna Aketi (School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, United States); Kaushik Roy (School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, United States)
Format: Article
Language: English
Published: Frontiers Media S.A., 2020-06-01
Series: Frontiers in Neuroscience, Volume 14
ISSN: 1662-453X
DOI: 10.3389/fnins.2020.00653
Subjects: spiking neural networks; energy efficiency; backward residual connection; stochastic softmax; hybridization; improved accuracy
Online Access: https://www.frontiersin.org/article/10.3389/fnins.2020.00653/full