Communication-Efficient Distributed SGD with Error-Feedback, Revisited
We show that the convergence proof of dist-EF-SGD, a recent algorithm for communication-efficient distributed stochastic gradient descent using error-feedback (Zheng et al., Communication-efficient distributed blockwise momentum SGD with error-feedback, in Advances in Neural Informatio...
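The abstract concerns error-feedback for compressed distributed SGD. As background for readers unfamiliar with the idea, below is a minimal single-worker sketch of the general error-feedback mechanism (not the paper's dist-EF-SGD algorithm or its revised proof): the worker compresses its update with a top-k sparsifier, transmits only the compressed part, and carries the compression residual into the next step. The function names and the quadratic test objective are illustrative choices, not from the paper.

```python
import numpy as np

def topk_compress(v, k):
    """Keep the k largest-magnitude entries of v; zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ef_sgd_step(x, grad, error, lr, k):
    """One error-feedback step: compress (lr * grad + carried error),
    apply only the compressed update, and carry the residual forward."""
    p = lr * grad + error          # correct the update with the accumulated error
    delta = topk_compress(p, k)    # only k coordinates would be communicated
    new_error = p - delta          # residual is kept locally for the next step
    return x - delta, new_error

# Toy usage: minimize f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x = np.array([1.0, -2.0, 3.0, -4.0])
err = np.zeros_like(x)
for _ in range(200):
    x, err = ef_sgd_step(x, grad=x, error=err, lr=0.1, k=2)
print(np.linalg.norm(x))  # the iterate norm shrinks toward 0
```

The key property, which the convergence analyses of such methods exploit, is that the residual `new_error` is bounded, so the compressed iterates track the uncompressed SGD trajectory up to a vanishing correction.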
| Main Authors: | Tran Thi Phuong, Le Trieu Phong |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | Springer, 2021-04-01 |
| Series: | International Journal of Computational Intelligence Systems |
| Subjects: | |
| Online Access: | https://www.atlantis-press.com/article/125955624/view |
Similar Items

- Distributed SGD With Flexible Gradient Compression
  by: Tran Thi Phuong, et al.
  Published: (2020-01-01)
- Distributed SignSGD With Improved Accuracy and Network-Fault Tolerance
  by: Trieu Le Phong, et al.
  Published: (2020-01-01)
- ZenoPS: A Distributed Learning System Integrating Communication Efficiency and Security
  by: Cong Xie, et al.
  Published: (2022-07-01)
- Trend-Smooth: Accelerate Asynchronous SGD by Smoothing Parameters Using Parameter Trends
  by: Guoxin Cui, et al.
  Published: (2019-01-01)
- Performance Analysis of the Stochastic Gradient Descent (SGD) Algorithm in Classifying Formalin-Contaminated Tofu (original title: "Analisis Performa Algoritma Stochastic Gradient Descent (SGD) Dalam Mengklasifikasi Tahu Berformalin")
  by: Fadhila Tangguh Admojo, et al.
  Published: (2022-03-01)