Efficient Synapse Memory Structure for Reconfigurable Digital Neuromorphic Hardware

Spiking Neural Networks (SNNs) have high potential to process information efficiently with binary spikes and time-delay information. Recently, dedicated SNN hardware accelerators with on-chip synapse memory arrays have been gaining interest as a way to overcome the limitations of running software-based SNNs on conventional von Neumann machines. In this paper, we propose an efficient synapse memory structure that reduces hardware resource usage while maintaining performance and network size. In the proposed design, the synapse memory size is reduced by applying presynaptic weight scaling. In addition, axonal/neuronal offsets are applied to implement multiple layers on a single memory array. Finally, a transposable memory addressing scheme is presented for faster spike-timing-dependent plasticity (STDP) learning. We implemented an SNN ASIC chip based on the proposed scheme in 65 nm CMOS technology. Chip measurement results show that the proposed design provides up to a 200× speedup over a CPU while consuming 53 mW at 100 MHz, with an energy efficiency of 15.2 pJ/SOP.
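
The record above only summarizes the techniques. As a rough, hypothetical sketch (not the paper's actual design), the Python below illustrates the general idea behind two of them: packing multiple layers into one synapse-weight array via axonal/neuronal offsets, and reading that array either row-wise (per presynaptic axon, for spike propagation) or column-wise (per postsynaptic neuron, for STDP weight updates), which is the access pattern a transposable addressing scheme accelerates. The class name `SynapseMemory`, all parameters, and the example network sizes are invented for illustration.

```python
# Hypothetical sketch (not the paper's implementation): one flat weight array
# shared by several layers, addressed either by presynaptic axon (row access,
# used when a spike arrives) or by postsynaptic neuron (column access, used
# when an STDP update needs all incoming weights of one neuron).

class SynapseMemory:
    def __init__(self, num_axons, num_neurons):
        self.num_axons = num_axons          # total presynaptic lines in the array
        self.num_neurons = num_neurons      # total postsynaptic neurons in the array
        self.weights = [0.0] * (num_axons * num_neurons)  # flat on-chip memory model
        self.layers = {}                    # layer id -> (axon_offset, neuron_offset, n_in, n_out)

    def map_layer(self, layer_id, axon_offset, neuron_offset, n_in, n_out):
        """Place a layer's n_in x n_out weight block at the given offsets."""
        assert axon_offset + n_in <= self.num_axons
        assert neuron_offset + n_out <= self.num_neurons
        self.layers[layer_id] = (axon_offset, neuron_offset, n_in, n_out)

    def _addr(self, axon, neuron):
        # Row-major physical address; a transposable scheme lets the hardware
        # fetch either a full row or a full column of this array efficiently.
        return axon * self.num_neurons + neuron

    def read_row(self, layer_id, local_axon):
        """Forward path: all outgoing weights of one presynaptic axon."""
        a_off, n_off, n_in, n_out = self.layers[layer_id]
        axon = a_off + local_axon
        return [self.weights[self._addr(axon, n_off + j)] for j in range(n_out)]

    def read_column(self, layer_id, local_neuron):
        """STDP path: all incoming weights of one postsynaptic neuron."""
        a_off, n_off, n_in, n_out = self.layers[layer_id]
        neuron = n_off + local_neuron
        return [self.weights[self._addr(a_off + i, neuron)] for i in range(n_in)]


# Example (arbitrary sizes): two layers of a 784-256-10 network packed into one array.
mem = SynapseMemory(num_axons=2048, num_neurons=512)
mem.map_layer(0, axon_offset=0, neuron_offset=0, n_in=784, n_out=256)
mem.map_layer(1, axon_offset=784, neuron_offset=256, n_in=256, n_out=10)
```

In the actual chip the column-wise (transposed) access would presumably be handled by the memory addressing hardware rather than by software loops; the sketch only conveys the two access patterns and the offset-based layer mapping.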

Bibliographic Details
Main Authors: Jinseok Kim, Jongeun Koo, Taesu Kim, Jae-Joon Kim
Author Affiliations: Jinseok Kim, Taesu Kim, Jae-Joon Kim (Department of Creative IT Engineering, Pohang University of Science and Technology, Pohang, South Korea); Jongeun Koo, Jae-Joon Kim (Department of Electrical Engineering, Pohang University of Science and Technology, Pohang, South Korea)
Format: Article
Language: English
Published: Frontiers Media S.A., 2018-11-01
Series: Frontiers in Neuroscience, Vol. 12
ISSN: 1662-453X
DOI: 10.3389/fnins.2018.00829
Subjects: neuromorphic system; spiking neural network; spike-timing-dependent plasticity; on-chip learning; transposable memory
Online Access: https://www.frontiersin.org/article/10.3389/fnins.2018.00829/full