Gradient Decomposition Methods for Training Neural Networks With Non-ideal Synaptic Devices
While promising for high-capacity machine learning accelerators, memristor devices have non-idealities that prevent software-equivalent accuracies when used for online training. This work uses a combination of Mini-Batch Gradient Descent (MBGD) to average gradients, stochastic rounding to avoid vani...
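The abstract names stochastic rounding as the mechanism for keeping small averaged gradients from being lost when updates must land on discrete device levels. A minimal sketch of unbiased stochastic rounding, assuming a NumPy-style implementation (the function name and `step` granularity are illustrative, not from the paper):

```python
import numpy as np

def stochastic_round(x, step):
    """Round each entry of x to a multiple of `step`, rounding up with
    probability equal to the fractional remainder, so the result is
    unbiased in expectation: E[stochastic_round(x)] == x."""
    scaled = x / step
    low = np.floor(scaled)
    prob_up = scaled - low                      # fractional part in [0, 1)
    up = np.random.random(x.shape) < prob_up    # round up with that probability
    return (low + up) * step

# Gradients smaller than the minimum programmable update `step` are not
# silently truncated to zero; they still apply with nonzero probability.
grads = np.array([0.003, -0.012, 0.0004])
print(stochastic_round(grads, step=0.01))
```

Deterministic rounding would zero out every averaged gradient below `step / 2`; stochastic rounding preserves those small contributions on average, which is the vanishing-update failure mode the abstract alludes to.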
| Main Authors | Junyun Zhao, Siyuan Huang, Osama Yousuf, Yutong Gao, Brian D. Hoskins, Gina C. Adam |
|---|---|
| Format | Article |
| Language | English |
| Published | Frontiers Media S.A., 2021-11-01 |
| Series | Frontiers in Neuroscience |
| Online Access | https://www.frontiersin.org/articles/10.3389/fnins.2021.749811/full |
Similar Items
- Exploiting device-level non-idealities for adversarial attacks on ReRAM-based neural networks
  by: Tyler McLemore, et al.
  Published: (2023-07-01)
- Direct Visualization of Charge Migration in Bilayer Tantalum Oxide Films by Multimodal Imaging
  by: Matthew Flynn‐Hepford, et al.
  Published: (2024-01-01)
- Neuromorphic Computing Using Emerging Synaptic Devices: A Retrospective Summary and an Outlook
  by: Jaeyoung Park
  Published: (2020-09-01)
- Crossbar-constrained technology mapping for ReRAM based in-memory computing
  by: Debjyoti Bhattacharjee, et al.
  Published: (2021)
- Tailor-made synaptic dynamics based on memristive devices
  by: Christopher Bengel, et al.
  Published: (2023-01-01)