Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics


Bibliographic Details
Main Authors: Chenglong Zou, Xiaoxin Cui, Guang Chen, Yuanyuan Jiang, Yuan Wang
Format: Article
Language: English
Published: Wiley 2023-12-01
Series: Advanced Intelligent Systems
Online Access: https://doi.org/10.1002/aisy.202300383
Description
Summary: Spiking neural networks (SNNs) have become popular choices for processing spatiotemporal input data and for enabling low-power, event-driven spike computation on neuromorphic processors. However, direct SNN training algorithms are not fully compatible with the error back-propagation process, while indirect conversion algorithms based on artificial neural networks (ANNs) usually lose accuracy due to various approximation errors. Both approaches suffer from lower accuracies than their reference ANNs and require many time steps to achieve stable performance in deep architectures. In this article, a novel conversion framework is presented for deep SNNs with negative-spike dynamics, which incorporates a quantization constraint and a spike-compensation technique into the ANN-to-SNN conversion, achieving truly lossless accuracy with respect to the ANN counterparts. The converted SNNs retain the full advantages of simple leaky-integrate-and-fire spiking neurons and are well suited for hardware implementation. Experimental results show that the converted spiking LeNet on MNIST/FashionMNIST and VGG-Net on the CIFAR-10 dataset yield state-of-the-art classification accuracies with far fewer computing time steps and much fewer synaptic operations.
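The negative-spike dynamics mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' published method: the neuron model below (soft-reset integrate-and-fire with a signed output), the threshold `v_th`, and the compensation rule are illustrative assumptions showing how late negative spikes could cancel premature positive ones during rate-coded conversion.

```python
# Illustrative sketch (assumed model, not the paper's exact neuron):
# an integrate-and-fire neuron with signed spikes. A +1 spike is emitted
# when the membrane potential crosses +v_th (soft reset by subtraction);
# a -1 spike is emitted when the potential goes negative after earlier
# positive spikes, compensating for spikes fired too eagerly.

def signed_if_neuron(inputs, v_th=1.0):
    """Integrate inputs over time steps and return the signed spike train."""
    v = 0.0       # membrane potential
    fired = 0     # net positive spikes emitted so far
    spikes = []
    for x in inputs:
        v += x
        if v >= v_th:              # enough accumulated charge: positive spike
            v -= v_th              # soft reset (subtract threshold)
            fired += 1
            spikes.append(1)
        elif v < 0 and fired > 0:  # overshot earlier: negative spike compensates
            v += v_th
            fired -= 1
            spikes.append(-1)
        else:
            spikes.append(0)
    return spikes

# Net spike count approximates the quantized input sum: here the inputs
# sum to 0.5, below v_th, so the early +1 is cancelled by a later -1.
print(signed_if_neuron([0.6, 0.6, -0.9, 0.2]))  # → [0, 1, -1, 0]
```

The design point this sketches is why negative spikes help losslessness: with only positive spikes, a neuron that fires early on transiently large input can never retract that spike, so the rate over a finite window misrepresents the ANN activation; a signed spike train can correct itself within the same window.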
ISSN: 2640-4567