Mobility-Aware Federated Learning Considering Multiple Networks
Main Authors: | Daniel Macedo, Danilo Santos, Angelo Perkusich, Dalton C. G. Valadares |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-07-01 |
Series: | Sensors |
Subjects: | machine learning, distributed learning, federated learning, mobility |
Online Access: | https://www.mdpi.com/1424-8220/23/14/6286 |
_version_ | 1797587535048736768 |
---|---|
author | Daniel Macedo, Danilo Santos, Angelo Perkusich, Dalton C. G. Valadares |
author_facet | Daniel Macedo, Danilo Santos, Angelo Perkusich, Dalton C. G. Valadares |
author_sort | Daniel Macedo |
collection | DOAJ |
description | Federated learning (<i>FL</i>) is a distributed training method for machine learning (<i>ML</i>) models that keeps data ownership with users. However, this distributed training approach can lead to variations in efficiency due to user behaviors or characteristics. For instance, mobility can hinder training by causing a client to drop out when its device loses connection with the other devices on the network. To address this issue, we propose an <i>FL</i> coordination algorithm, <i>MoFeL</i>, to ensure efficient training even in scenarios with mobility. Furthermore, <i>MoFeL</i> evaluates multiple networks with different central servers. To evaluate its effectiveness, we conducted simulation experiments using an image classification application whose models are trained by a convolutional neural network. The simulation results demonstrate that <i>MoFeL</i> outperforms traditional training coordination algorithms in <i>FL</i>: in scenarios with high mobility, it completes <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mrow><mn>156.5</mn><mo>%</mo></mrow></semantics></math></inline-formula> more training cycles than an algorithm that does not consider mobility aspects. |
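The dropout problem the abstract describes can be illustrated with a minimal federated round: each connected client trains locally, mobile clients may lose connectivity mid-round, and the server aggregates only the updates that arrive. This is an illustrative sketch, not the <i>MoFeL</i> algorithm itself (which is not reproduced in this record); names such as `stay_probability` and the scalar "model" are assumptions made for brevity.

```python
import random

def fl_round(global_model, clients, rng):
    """One federated round: connected clients train locally; the server
    averages only the updates from clients that did not drop out."""
    updates = []
    for c in clients:
        # Mobility effect: a client may leave the network mid-round.
        if rng.random() > c["stay_probability"]:
            continue  # dropout: this client's update never reaches the server
        # Local training sketch: move the weight toward the client's data mean.
        local = global_model + 0.5 * (c["data_mean"] - global_model)
        updates.append(local)
    if not updates:  # every client dropped out; the round is wasted
        return global_model, 0
    return sum(updates) / len(updates), len(updates)

# Three hypothetical clients; the middle one is highly mobile (low stay chance).
clients = [{"stay_probability": p, "data_mean": m}
           for p, m in [(0.9, 1.0), (0.2, 3.0), (0.8, 2.0)]]
model, rng = 0.0, random.Random(42)
for _ in range(10):
    model, n_survivors = fl_round(model, clients, rng)
```

A mobility-aware coordinator, in this framing, would bias client selection (or the choice of central server among the multiple networks) toward clients likely to remain connected, so fewer rounds end with empty or tiny aggregates.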
first_indexed | 2024-03-11T00:41:12Z |
format | Article |
id | doaj.art-960e8b419af44af3aa9ba2615f955c91 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-11T00:41:12Z |
publishDate | 2023-07-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-960e8b419af44af3aa9ba2615f955c912023-11-18T21:15:37ZengMDPI AGSensors1424-82202023-07-012314628610.3390/s23146286Mobility-Aware Federated Learning Considering Multiple NetworksDaniel Macedo0Danilo Santos1Angelo Perkusich2Dalton C. G. Valadares3Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande 58429-900, Paraiba, BrazilVirtus RDI Center, Federal University of Campina Grande, Campina Grande 58429-900, Paraiba, BrazilVirtus RDI Center, Federal University of Campina Grande, Campina Grande 58429-900, Paraiba, BrazilDepartment of Electrical Engineering, Federal University of Campina Grande, Campina Grande 58429-900, Paraiba, BrazilFederated learning (<i>FL</i>) is a distributed training method for machine learning models (<i>ML</i>) that maintain data ownership on users. However, this distributed training approach can lead to variations in efficiency due to user behaviors or characteristics. For instance, mobility can hinder training by causing a client dropout when a device loses connection with other devices on the network. To address this issue, we propose a <i>FL</i> coordination algorithm, <i>MoFeL</i>, to ensure efficient training even in scenarios with mobility. Furthermore, <i>MoFeL</i> evaluates multiple networks with different central servers. To evaluate its effectiveness, we conducted simulation experiments using an image classification application that utilizes machine models trained by a convolutional neural network. The simulation results demonstrate that <i>MoFeL</i> outperforms traditional training coordination algorithms in <i>FL</i>, with <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mrow><mn>156.5</mn><mo>%</mo></mrow></semantics></math></inline-formula> more training cycles, in scenarios with high mobility compared to an algorithm that does not consider mobility aspects.https://www.mdpi.com/1424-8220/23/14/6286machine learningdistributed learningfederated learningmobility |
spellingShingle | Daniel Macedo Danilo Santos Angelo Perkusich Dalton C. G. Valadares Mobility-Aware Federated Learning Considering Multiple Networks Sensors machine learning distributed learning federated learning mobility |
title | Mobility-Aware Federated Learning Considering Multiple Networks |
title_full | Mobility-Aware Federated Learning Considering Multiple Networks |
title_fullStr | Mobility-Aware Federated Learning Considering Multiple Networks |
title_full_unstemmed | Mobility-Aware Federated Learning Considering Multiple Networks |
title_short | Mobility-Aware Federated Learning Considering Multiple Networks |
title_sort | mobility aware federated learning considering multiple networks |
topic | machine learning distributed learning federated learning mobility |
url | https://www.mdpi.com/1424-8220/23/14/6286 |
work_keys_str_mv | AT danielmacedo mobilityawarefederatedlearningconsideringmultiplenetworks AT danilosantos mobilityawarefederatedlearningconsideringmultiplenetworks AT angeloperkusich mobilityawarefederatedlearningconsideringmultiplenetworks AT daltoncgvaladares mobilityawarefederatedlearningconsideringmultiplenetworks |