Speculative Backpropagation for CNN Parallel Training

Parallel learning in neural networks can greatly shorten training time. Prior efforts were mostly limited to distributing inputs across multiple computing engines, because the gradient descent algorithm used in neural network training is inherently sequential. This paper proposes a nove...


Bibliographic Details
Main Authors: Sangwoo Park, Taeweon Suh
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9272337/