Direct Gradient Calculation: Simple and Variation‐Tolerant On‐Chip Training Method for Neural Networks
On‐chip training of neural networks (NNs) is regarded as a promising training method for neuromorphic systems with analog synaptic devices. Herein, a novel on‐chip training method called direct gradient calculation (DGC) is proposed to replace conventional backpropagation (BP). In this method, th...
Main Authors: Hyungyo Kim, Joon Hwang, Dongseok Kwon, Jangsaeng Kim, Min-Kyu Park, Jiseong Im, Byung-Gook Park, Jong-Ho Lee
Format: Article
Language: English
Published: Wiley, 2021-08-01
Series: Advanced Intelligent Systems
Online Access: https://doi.org/10.1002/aisy.202100064
Similar Items
- On-Chip Trainable Spiking Neural Networks Using Time-To-First-Spike Encoding
  by: Jiseong Im, et al. Published: (2022-01-01)
- Hardware-Based Spiking Neural Network Using a TFT-Type AND Flash Memory Array Architecture Based on Direct Feedback Alignment
  by: Won-Mook Kang, et al. Published: (2021-01-01)
- On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices
  by: Dongseok Kwon, et al. Published: (2020-07-01)
- Spike-Train Level Direct Feedback Alignment: Sidestepping Backpropagation for On-Chip Training of Spiking Neural Nets
  by: Jeongjun Lee, et al. Published: (2020-03-01)
- Spiking Neural Networks With Time-to-First-Spike Coding Using TFT-Type Synaptic Device Model
  by: Seongbin Oh, et al. Published: (2021-01-01)