Network Coding Approaches for Distributed Computation over Lossy Wireless Networks

Bibliographic Details
Main Authors: Bin Fan, Bin Tang, Zhihao Qu, Baoliu Ye
Format: Article
Language: English
Published: MDPI AG, 2023-02-01
Series: Entropy
Subjects: distributed computing; coded computation; network coding; lossy wireless network; BATS codes
Online Access: https://www.mdpi.com/1099-4300/25/3/428
Collection: DOAJ
Description: In wireless distributed computing systems, worker nodes connect to a master node wirelessly and perform large-scale computational tasks that are parallelized across them. However, straggling (i.e., worker nodes experiencing unpredictable slowdowns during computation and communication) and packet losses caused by severe channel fading can significantly increase the latency of computational tasks. In this paper, we consider a heterogeneous wireless distributed computing system performing large-scale matrix multiplications, which form the core of many machine learning applications. To address these challenges, we first propose a random linear network coding (RLNC) approach that leverages the linearity of matrix multiplication and offers several salient properties, including ratelessness, maximum straggler tolerance, and near-ideal load balancing. We then prove theoretically that its latency converges in probability to the optimum as the matrix size grows to infinity. To combat the high encoding and decoding overheads of the RLNC approach, we further propose a practical variant based on batched sparse (BATS) codes. The effectiveness of both approaches is demonstrated through numerical simulations.
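The RLNC idea described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: it uses real-valued coefficients over floats for simplicity (practical RLNC operates over a finite field), and all names and parameters are assumptions for the example. The master splits the matrix row-wise, hands workers random linear combinations of the blocks (rateless: it can mint as many coded tasks as needed), and decodes from any sufficiently many results whose coefficient rows are linearly independent — so any stragglers' results can simply be ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem: compute y = A @ x, with A split row-wise into k blocks.
k = 4                      # number of row blocks
rows_per_block, n = 3, 5
A = rng.standard_normal((k * rows_per_block, n))
x = rng.standard_normal(n)
blocks = np.split(A, k)    # A_1, ..., A_k, each rows_per_block x n

# Master: each coded task is a random linear combination of the blocks.
# Ratelessness: generate as many coded tasks as needed (k + 2 here for slack).
m = k + 2
coeffs = rng.standard_normal((m, k))
coded_blocks = [sum(c[i] * blocks[i] for i in range(k)) for c in coeffs]

# Workers: each computes its coded block times x (some may straggle or drop).
results = [cb @ x for cb in coded_blocks]

# Master: decode from any k results whose coefficient rows are independent.
# With continuous random coefficients this holds almost surely.
received = [0, 2, 3, 5]    # indices of the first k results to arrive
C = coeffs[received]                          # k x k coefficient matrix
R = np.stack([results[i] for i in received])  # k x rows_per_block
decoded_blocks = np.linalg.solve(C, R)        # row i recovers A_i @ x
y = decoded_blocks.reshape(-1)

assert np.allclose(y, A @ x)
```

Because decoding succeeds from any k independent coded results, the scheme tolerates up to m - k straggling or lost tasks, and faster workers naturally contribute more results — the load-balancing property the abstract highlights.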
ISSN: 1099-4300
DOI: 10.3390/e25030428 (Entropy, Volume 25, Issue 3, Article 428)
Author Affiliations: Key Laboratory of Water Big Data Technology of Ministry of Water Resources, Hohai University, Nanjing 211100, China (all four authors)
Keywords: distributed computing; coded computation; network coding; lossy wireless network; BATS codes