Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics
Spiking neural networks (SNNs) have become popular choices for processing spatiotemporal input data and enabling low‐power, event‐driven spike computation on neuromorphic processors. However, direct SNN training algorithms are not fully compatible with the error back‐propagation process, while indirect conversion...
Main Authors: | Chenglong Zou, Xiaoxin Cui, Guang Chen, Yuanyuan Jiang, Yuan Wang |
Format: | Article |
Language: | English |
Published: | Wiley, 2023-12-01 |
Series: | Advanced Intelligent Systems |
Subjects: | artificial neural network, network conversion, network quantization, spike compensation, spiking neural network |
Online Access: | https://doi.org/10.1002/aisy.202300383 |
_version_ | 1797378745234882560 |
author | Chenglong Zou Xiaoxin Cui Guang Chen Yuanyuan Jiang Yuan Wang |
author_facet | Chenglong Zou Xiaoxin Cui Guang Chen Yuanyuan Jiang Yuan Wang |
author_sort | Chenglong Zou |
collection | DOAJ |
description | Spiking neural networks (SNNs) have become popular choices for processing spatiotemporal input data and enabling low‐power, event‐driven spike computation on neuromorphic processors. However, direct SNN training algorithms are not fully compatible with the error back‐propagation process, while indirect conversion algorithms based on artificial neural networks (ANNs) usually lose accuracy due to various approximation errors. Both approaches suffer from lower accuracy than their reference ANNs and require many time steps to reach stable performance in deep architectures. In this article, a novel conversion framework is presented for deep SNNs with negative‐spike dynamics, which takes a quantization constraint and a spike‐compensation technique into consideration during ANN‐to‐SNN conversion and achieves truly lossless accuracy with respect to the ANN counterparts. The converted SNNs retain the full advantages of simple leaky‐integrate‐and‐fire spiking neurons and are well suited to hardware implementation. Experimental results show that the converted spiking LeNet on MNIST/FashionMNIST and VGG‐Net on the CIFAR‐10 dataset yield state‐of‐the‐art classification accuracies with considerably shortened computing time steps and far fewer synaptic operations. |
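
The abstract names two ingredients of the conversion framework, a quantization constraint on the ANN activations and a spike‐compensation technique realized through negative spikes, but the record itself contains no equations or pseudocode. The sketch below is only a plausible, simplified illustration of how these two ideas can fit together, not the authors' actual algorithm: activations are quantized to T discrete levels so that each one can be matched exactly by a spike count within T time steps, and an integrate‐and‐fire neuron is allowed to emit negative spikes that cancel earlier over‐firing. The time window T, the threshold, the reset‐by‐subtraction rule, and the input patterns are illustrative assumptions.

```python
# Minimal illustrative sketch; NOT the paper's exact method. It only mirrors the
# two ideas named in the abstract: (1) quantize ANN activations so each one can
# be represented exactly by a spike count within T time steps, and (2) let an
# integrate-and-fire neuron emit negative spikes to compensate for spikes fired
# too early. T, the threshold, and the reset rule are assumptions.
import numpy as np

T = 8  # assumed number of simulation time steps


def quantize_activation(a: float, a_max: float = 1.0, levels: int = T) -> float:
    """Clip a ReLU activation to [0, a_max] and round it to `levels` discrete steps."""
    return np.round(np.clip(a, 0.0, a_max) * levels / a_max) * (a_max / levels)


def net_spike_count(weighted_inputs, threshold: float = 1.0) -> int:
    """Integrate-and-fire neuron emitting at most one positive or negative spike per step.

    A positive spike fires when the membrane potential crosses +threshold, a
    negative spike when it crosses -threshold; both use reset-by-subtraction,
    so residual charge is carried over to the next time step.
    """
    v, net = 0.0, 0
    for x in weighted_inputs:
        v += x
        if v >= threshold:        # ordinary positive spike
            net += 1
            v -= threshold
        elif v <= -threshold:     # negative spike cancels an earlier over-fire
            net -= 1
            v += threshold
    return net


# Toy check: a quantized activation of 0.625 (= 5/T) corresponds to a target of
# 5 net spikes. When the same total input charge arrives front-loaded, the neuron
# over-fires early and one negative spike restores the expected net count.
a_q = quantize_activation(0.625)                           # -> 0.625
print(a_q * T)                                             # -> 5.0  (target spike count)
print(net_spike_count([a_q] * T))                          # -> 5    (evenly driven)
print(net_spike_count([1.5, 1.5, -2.5, 0.5, 1, 1, 1, 1]))  # -> 5    (with compensation)
```

Under these assumptions the signed spike count, rather than the plain firing rate, is what tracks the quantized activation, which is one simple way a negative spike can "compensate" for premature firing during conversion.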
first_indexed | 2024-03-08T20:12:08Z |
format | Article |
id | doaj.art-7dc5f238753141bfa280292a8e1682ab |
institution | Directory Open Access Journal |
issn | 2640-4567 |
language | English |
last_indexed | 2024-03-08T20:12:08Z |
publishDate | 2023-12-01 |
publisher | Wiley |
record_format | Article |
series | Advanced Intelligent Systems |
spelling | doaj.art-7dc5f238753141bfa280292a8e1682ab (2023-12-23T04:53:50Z); Wiley; Advanced Intelligent Systems; ISSN 2640-4567; 2023-12-01; vol. 5, iss. 12, pp. n/a; doi:10.1002/aisy.202300383; Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics; Chenglong Zou (School of Mathematical Science, Peking University, Beijing 100871, China); Xiaoxin Cui, Guang Chen, Yuanyuan Jiang, Yuan Wang (School of Integrated Circuits, Peking University, Beijing 100871, China); keywords: artificial neural network, network conversion, network quantization, spike compensation, spiking neural network; https://doi.org/10.1002/aisy.202300383 |
spellingShingle | Chenglong Zou Xiaoxin Cui Guang Chen Yuanyuan Jiang Yuan Wang Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics Advanced Intelligent Systems artificial neural network network conversion network quantization spike compensation spiking neural network |
title | Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics |
title_full | Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics |
title_fullStr | Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics |
title_full_unstemmed | Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics |
title_short | Toward a Lossless Conversion for Spiking Neural Networks with Negative‐Spike Dynamics |
title_sort | toward a lossless conversion for spiking neural networks with negative spike dynamics |
topic | artificial neural network, network conversion, network quantization, spike compensation, spiking neural network |
url | https://doi.org/10.1002/aisy.202300383 |
work_keys_str_mv | AT chenglongzou towardalosslessconversionforspikingneuralnetworkswithnegativespikedynamics AT xiaoxincui towardalosslessconversionforspikingneuralnetworkswithnegativespikedynamics AT guangchen towardalosslessconversionforspikingneuralnetworkswithnegativespikedynamics AT yuanyuanjiang towardalosslessconversionforspikingneuralnetworkswithnegativespikedynamics AT yuanwang towardalosslessconversionforspikingneuralnetworkswithnegativespikedynamics |