An FPGA implementation of Bayesian inference with spiking neural networks

Spiking neural networks (SNNs), as brain-inspired neural network models based on spikes, have the advantage of processing information with low complexity and efficient energy consumption. Currently, there is a growing trend to design hardware accelerators for dedicated SNNs to overcome the limitation of running under the traditional von Neumann architecture. Probabilistic sampling is an effective modeling approach for implementing SNNs to simulate the brain to achieve Bayesian inference. However, sampling consumes considerable time. It is highly demanding for specific hardware implementation of SNN sampling models to accelerate inference operations. Hereby, we design a hardware accelerator based on FPGA to speed up the execution of SNN algorithms by parallelization. We use streaming pipelining and array partitioning operations to achieve model operation acceleration with the least possible resource consumption, and combine the Python productivity for Zynq (PYNQ) framework to implement the model migration to the FPGA while increasing the speed of model operations. We verify the functionality and performance of the hardware architecture on the Xilinx Zynq ZCU104. The experimental results show that the proposed hardware accelerator of the SNN sampling model can significantly improve the computing speed while ensuring the accuracy of inference. In addition, Bayesian inference for spiking neural networks through the PYNQ framework can fully exploit the high performance and low power consumption of FPGAs in embedded applications. Taken together, our proposed FPGA implementation of Bayesian inference with SNNs has great potential for a wide range of applications; it can be ideal for implementing complex probabilistic model inference in embedded systems.

Description

Bibliographic Details
Main Authors: Haoran Li, Bo Wan, Ying Fang, Qifeng Li, Jian K. Liu, Lingling An
Format: Article
Language: English
Published: Frontiers Media S.A., 2024-01-01
Series: Frontiers in Neuroscience
Subjects: spiking neural networks; probabilistic graphical models; Bayesian inference; importance sampling; FPGA
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2023.1291051/full
author Haoran Li
Bo Wan
Ying Fang
Qifeng Li
Jian K. Liu
Lingling An
collection DOAJ
description Spiking neural networks (SNNs), as brain-inspired neural network models based on spikes, have the advantage of processing information with low complexity and efficient energy consumption. Currently, there is a growing trend to design hardware accelerators for dedicated SNNs to overcome the limitation of running under the traditional von Neumann architecture. Probabilistic sampling is an effective modeling approach for implementing SNNs to simulate the brain to achieve Bayesian inference. However, sampling consumes considerable time. It is highly demanding for specific hardware implementation of SNN sampling models to accelerate inference operations. Hereby, we design a hardware accelerator based on FPGA to speed up the execution of SNN algorithms by parallelization. We use streaming pipelining and array partitioning operations to achieve model operation acceleration with the least possible resource consumption, and combine the Python productivity for Zynq (PYNQ) framework to implement the model migration to the FPGA while increasing the speed of model operations. We verify the functionality and performance of the hardware architecture on the Xilinx Zynq ZCU104. The experimental results show that the hardware accelerator of the SNN sampling model proposed can significantly improve the computing speed while ensuring the accuracy of inference. In addition, Bayesian inference for spiking neural networks through the PYNQ framework can fully optimize the high performance and low power consumption of FPGAs in embedded applications. Taken together, our proposed FPGA implementation of Bayesian inference with SNNs has great potential for a wide range of applications, it can be ideal for implementing complex probabilistic model inference in embedded systems.
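The abstract's core idea, Bayesian inference via probabilistic sampling, can be illustrated with a minimal importance-sampling sketch. This is a generic toy example (a Gaussian prior and likelihood chosen here for illustration), not a reproduction of the paper's SNN sampling model: samples drawn from a proposal (here, the prior) are reweighted by the likelihood to estimate posterior quantities.

```python
import math
import random

random.seed(0)

def likelihood(theta, x, sigma=0.5):
    # Unnormalized Gaussian likelihood N(x; theta, sigma^2);
    # the normalizer cancels in the self-normalized estimator below.
    return math.exp(-(x - theta) ** 2 / (2 * sigma ** 2))

x_obs = 1.2  # hypothetical observation

# Proposal = prior N(0, 1), so importance weights reduce to the likelihood.
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
weights = [likelihood(t, x_obs) for t in samples]

# Self-normalized importance-sampling estimate of the posterior mean.
posterior_mean = sum(w * t for w, t in zip(weights, samples)) / sum(weights)
# Analytic posterior mean for this conjugate toy case is 0.96.
```

Because each sample's weight is computed independently, this inner loop is exactly the kind of workload that parallel FPGA hardware can accelerate.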
format Article
id doaj.art-f8bd6eda8f2e42b4ade9c00cd9f16b2a
institution Directory Open Access Journal
issn 1662-453X
language English
publishDate 2024-01-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Neuroscience
spelling Frontiers Media S.A., Frontiers in Neuroscience, ISSN 1662-453X, 2024-01-01, vol. 17, doi: 10.3389/fnins.2023.1291051
affiliations Haoran Li: Guangzhou Institute of Technology, Xidian University, Guangzhou, China
Bo Wan: School of Computer Science and Technology, Xidian University, Xi'an, China; Key Laboratory of Smart Human Computer Interaction and Wearable Technology of Shaanxi Province, Xi'an, China
Ying Fang: College of Computer and Cyber Security, Fujian Normal University, Fuzhou, China; Digital Fujian Internet-of-Thing Laboratory of Environmental Monitoring, Fujian Normal University, Fuzhou, China
Qifeng Li: Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
Jian K. Liu: School of Computer Science, University of Birmingham, Birmingham, United Kingdom
Lingling An: Guangzhou Institute of Technology, Xidian University, Guangzhou, China; School of Computer Science and Technology, Xidian University, Xi'an, China
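The array-partitioning optimization mentioned in the abstract splits an on-chip buffer across multiple memory banks so that several multiply-accumulate lanes can read operands in the same cycle. A rough software analogue of a cyclically partitioned dot-product reduction (partition factor and sizes are hypothetical, not taken from the paper):

```python
P = 4                        # partition factor: number of parallel MAC lanes
weights = [0.5] * 16         # toy weight buffer
inputs = [2.0] * 16          # toy input buffer

# Cyclic partitioning: element i is assigned to bank i % P.
banks_w = [weights[i::P] for i in range(P)]
banks_x = [inputs[i::P] for i in range(P)]

# Each lane reduces its own bank; on hardware these P reductions
# would run concurrently rather than sequentially.
lane_sums = [sum(w * x for w, x in zip(bw, bx))
             for bw, bx in zip(banks_w, banks_x)]

# A final adder stage combines the per-lane partial sums.
result = sum(lane_sums)
```

In an HLS flow the same effect is typically requested with partitioning and pipelining directives on the arrays and loops; the sketch above only mimics the resulting data layout in software.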
title An FPGA implementation of Bayesian inference with spiking neural networks
topic spiking neural networks
probabilistic graphical models
Bayesian inference
importance sampling
FPGA
url https://www.frontiersin.org/articles/10.3389/fnins.2023.1291051/full