A Graph Neural Network Based Decentralized Learning Scheme

As an emerging paradigm considering data privacy and transmission efficiency, decentralized learning aims to acquire a global model using the training data distributed over many user devices. It is a challenging problem since link loss, partial device participation, and non-independent and identically distributed (non-iid) data distribution would all deteriorate the performance of decentralized learning algorithms. Existing work may restrict to linear models or show poor performance over non-iid data. Therefore, in this paper, we propose a decentralized learning scheme based on distributed parallel stochastic gradient descent (DPSGD) and graph neural network (GNN) to deal with the above challenges. Specifically, each user device participating in the learning task utilizes local training data to compute local stochastic gradients and updates its own local model. Then, each device utilizes the GNN model and exchanges the model parameters with its neighbors to reach the average of resultant global models. The iteration repeats until the algorithm converges. Extensive simulation results over both iid and non-iid data validate the algorithm's convergence to near optimal results and robustness to both link loss and partial device participation.

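The abstract describes an iterative two-step loop: each device takes a local gradient step, then exchanges parameters with its neighbors and averages toward consensus. A minimal numerical sketch of that loop, with the paper's learned GNN aggregation replaced by fixed Metropolis mixing weights; the ring topology, least-squares objective, and all variable names are illustrative assumptions, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n_devices, dim, steps, lr = 8, 5, 500, 0.05

# Each device holds its own local linear-regression data A_i x = b_i
# (shifted means mimic heterogeneous local distributions).
x_true = rng.normal(size=dim)
A = [rng.normal(loc=i * 0.1, size=(20, dim)) for i in range(n_devices)]
b = [Ai @ x_true for Ai in A]

# Ring topology: each device communicates only with its two neighbors.
neighbors = {i: [(i - 1) % n_devices, (i + 1) % n_devices]
             for i in range(n_devices)}

# Metropolis weights yield a symmetric, doubly stochastic mixing matrix W,
# standing in for the learned GNN aggregation of the paper.
W = np.zeros((n_devices, n_devices))
for i in range(n_devices):
    for j in neighbors[i]:
        W[i, j] = 1.0 / (1 + max(len(neighbors[i]), len(neighbors[j])))
    W[i, i] = 1.0 - W[i].sum()

x = rng.normal(size=(n_devices, dim))  # one local model per device
for _ in range(steps):
    # 1) local gradient step on each device (full batch here for
    #    simplicity; the scheme uses stochastic gradients)
    grads = np.stack([A[i].T @ (A[i] @ x[i] - b[i]) / len(b[i])
                      for i in range(n_devices)])
    x = x - lr * grads
    # 2) exchange parameters with neighbors and average (consensus step)
    x = W @ x

# Devices should agree with each other and approach the global optimum.
consensus_err = np.max(np.linalg.norm(x - x.mean(axis=0), axis=1))
model_err = np.linalg.norm(x.mean(axis=0) - x_true)
```

After enough iterations both the disagreement across devices and the distance to the optimum shrink, which is the convergence behavior the abstract claims for the full scheme.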

Bibliographic Details
Main Authors: Huiguo Gao, Mengyuan Lee, Guanding Yu, Zhaolin Zhou
Format: Article
Language: English
Published: MDPI AG, 2022-01-01
Series: Sensors
Subjects: decentralized learning; graph neural network; average consensus
Online Access: https://www.mdpi.com/1424-8220/22/3/1030
ISSN: 1424-8220
DOI: 10.3390/s22031030
Affiliations: College of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China (Huiguo Gao, Mengyuan Lee, Guanding Yu); College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China (Zhaolin Zhou)