IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation
Spiking neural networks (SNNs) are widely recognized for their biomimetic and efficient computing features; they use spikes to encode and transmit information. Despite their many advantages, SNNs suffer from low accuracy when trained directly and from high inference latency when converted from artificial neural network (ANN) training methods. To address both limitations, we propose a novel training pipeline, called IDSNN, based on parameter initialization and knowledge distillation, using an ANN as both a parameter source and a teacher. IDSNN maximizes the knowledge extracted from ANNs and achieves competitive top-1 accuracy on CIFAR10 (94.22%) and CIFAR100 (75.41%) with low latency. More importantly, it converges 14× faster than directly trained SNNs under limited training resources, which demonstrates its practical value in applications.
Main Authors: | Xiongfei Fan, Hong Zhang, Yu Zhang |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-08-01 |
Series: | Biomimetics |
Subjects: | spiking neural networks (SNNs); knowledge distillation; initialization; image classification |
Online Access: | https://www.mdpi.com/2313-7673/8/4/375 |
ISSN: | 2313-7673 |
DOI: | 10.3390/biomimetics8040375 |
Author affiliations: | State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China (all three authors) |
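The abstract names two mechanisms: copying a trained ANN's parameters into a structurally matched SNN, and distilling the ANN teacher's soft predictions into the SNN student during training. The sketch below is a minimal, hypothetical PyTorch illustration of those two ideas, not the authors' released code: the toy network, the IF neuron with a rectangular surrogate gradient, the timestep count `T`, and the distillation weights `tau`/`alpha` are all assumptions for demonstration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient (an assumed, common SNN trick)."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 1.0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out * ((v - 1.0).abs() < 0.5).float()


class ANN(nn.Module):
    """Toy teacher: one conv layer with ReLU, then a classifier."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
        self.fc = nn.Linear(16 * 32 * 32, num_classes)

    def forward(self, x):
        return self.fc(F.relu(self.conv(x)).flatten(1))


class SNN(nn.Module):
    """Same layer names/shapes as ANN; ReLU replaced by IF neurons unrolled over T steps."""

    def __init__(self, num_classes=10, T=4):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
        self.fc = nn.Linear(16 * 32 * 32, num_classes)
        self.T = T

    def forward(self, x):
        v = torch.zeros_like(self.conv(x))    # membrane potential
        out = 0.0
        for _ in range(self.T):               # rate coding: average logits over time
            v = v + self.conv(x)              # constant-current input encoding
            s = SpikeFn.apply(v)              # fire where v crosses the threshold (1.0)
            v = v - s                         # soft reset by the threshold
            out = out + self.fc(s.flatten(1))
        return out / self.T


def kd_loss(student_logits, teacher_logits, labels, tau=4.0, alpha=0.7):
    """Hinton-style distillation: KL on softened logits mixed with cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        F.softmax(teacher_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau * tau
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard


teacher, student = ANN(), SNN()
# (1) Initialization: layer names and shapes match, so trained ANN weights load directly.
student.load_state_dict(teacher.state_dict())
# (2) Distillation: the (detached) ANN teacher supervises the SNN student.
x = torch.randn(8, 3, 32, 32)                 # CIFAR-sized dummy batch
y = torch.randint(0, 10, (8,))
loss = kd_loss(student(x), teacher(x).detach(), y)
loss.backward()
```

The sketch only shows why a shared layer layout makes direct weight loading possible and how a distillation loss would combine the two supervision signals; the paper's actual architectures, neuron model, and training schedule may differ.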