A parallel and distributed stochastic gradient descent implementation using commodity clusters
Abstract Deep Learning is an increasingly important subdomain of artificial intelligence, which benefits from training on Big Data. The size and complexity of the model, combined with the size of the training dataset, make the training process computationally and temporally expensive. Accelerati...
Main Authors: | Robert K. L. Kennedy, Taghi M. Khoshgoftaar, Flavio Villanustre, Timothy Humphrey
---|---
Format: | Article
Language: | English
Published: | SpringerOpen, 2019-02-01
Series: | Journal of Big Data
Online Access: | http://link.springer.com/article/10.1186/s40537-019-0179-2
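This record does not include the paper's implementation details. Purely as an illustration of the data-parallel SGD pattern the title refers to, the sketch below simulates synchronous gradient averaging across workers in a single process; all function names, hyperparameters, and the toy dataset are hypothetical and are not taken from the article.

```python
# Illustrative sketch only: synchronous data-parallel SGD simulated in one
# process. This is NOT the paper's implementation; names are hypothetical.
import numpy as np

def worker_gradient(w, X_shard, y_shard):
    """Gradient of mean squared error on one worker's mini-batch."""
    preds = X_shard @ w
    return 2.0 * X_shard.T @ (preds - y_shard) / len(y_shard)

def parallel_sgd(X, y, n_workers=4, lr=0.01, epochs=50, batch=32, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    # Partition the training data into one shard per worker.
    shards = np.array_split(np.arange(len(y)), n_workers)
    for _ in range(epochs):
        for idx in shards:
            rng.shuffle(idx)
        steps = min(len(s) for s in shards) // batch
        for step in range(steps):
            # Each worker computes a local gradient on its own mini-batch;
            # the driver averages them and applies one synchronous update.
            grads = []
            for idx in shards:
                b = idx[step * batch:(step + 1) * batch]
                grads.append(worker_gradient(w, X[b], y[b]))
            w -= lr * np.mean(grads, axis=0)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 5))
    true_w = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
    y = X @ true_w + 0.1 * rng.normal(size=2000)
    print(parallel_sgd(X, y))  # should approach true_w
```

In a real distributed setting the gradient averaging step would be performed by the cluster framework (e.g., via an all-reduce or a parameter server) rather than a loop in one process; the single-process loop here only stands in for that communication step.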
Similar Items
- Adaptive Gradient Estimation Stochastic Parallel Gradient Descent Algorithm for Laser Beam Cleanup
  by: Shiqing Ma, et al.
  Published: (2021-05-01)
- Pipelined Stochastic Gradient Descent with Taylor Expansion
  by: Bongwon Jang, et al.
  Published: (2023-10-01)
- AG-SGD: Angle-Based Stochastic Gradient Descent
  by: Chongya Song, et al.
  Published: (2021-01-01)
- Damped Newton Stochastic Gradient Descent Method for Neural Networks Training
  by: Jingcheng Zhou, et al.
  Published: (2021-06-01)
- Recent Advances in Stochastic Gradient Descent in Deep Learning
  by: Yingjie Tian, et al.
  Published: (2023-01-01)