Distributed Stochastic Gradient Descent With Compressed and Skipped Communication

This paper introduces CompSkipDSGD, a new algorithm for distributed stochastic gradient descent that aims to improve communication efficiency by compressing and selectively skipping communication. In addition to compression, CompSkipDSGD allows both workers and the server to skip communication in an...
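
To make the two ideas named in the abstract concrete, below is a minimal sketch of distributed SGD with compressed and skipped communication. It is not the paper's CompSkipDSGD algorithm: the top-k sparsifier, the norm-threshold skip rule (`skip_tau`), and the per-worker quadratic objective are all assumptions chosen for illustration only.

```python
"""Hypothetical sketch: distributed SGD with compression and skipped communication.

Not the paper's CompSkipDSGD; the compressor, skip criterion, and objective
are placeholder assumptions used only to illustrate the general pattern.
"""
import numpy as np


def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero the rest (assumed compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out


def run(num_workers=4, dim=20, rounds=50, lr=0.1, k=5, skip_tau=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    # Each worker holds a simple quadratic objective 0.5 * ||x - a_i||^2 (assumption).
    targets = rng.normal(size=(num_workers, dim))
    x = np.zeros(dim)  # shared model, kept in sync by the server

    for _ in range(rounds):
        aggregate = np.zeros(dim)
        for i in range(num_workers):
            grad = x - targets[i] + 0.01 * rng.normal(size=dim)  # stochastic gradient
            compressed = top_k(grad, k)
            # Worker-side skip: send nothing when the compressed update is negligible.
            if np.linalg.norm(compressed) >= skip_tau:
                aggregate += compressed
        update = aggregate / num_workers
        # Server-side skip: do not broadcast an update when it is negligible.
        if np.linalg.norm(update) >= skip_tau:
            x -= lr * update
    return x


if __name__ == "__main__":
    x_final = run()
    print("final parameter norm:", np.linalg.norm(x_final))
```

In this sketch, skipping happens at both ends of the pipeline (workers withhold tiny compressed updates, the server withholds tiny averaged updates), which mirrors the abstract's statement that both workers and the server may skip communication; the specific skip criterion used by CompSkipDSGD is not reproduced here.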


Bibliographic Details
Main Authors: Tran Thi Phuong, Le Trieu Phong, Kazuhide Fukushima
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10251499/