Mobility-Aware Federated Learning Considering Multiple Networks


Bibliographic Details
Main Authors: Daniel Macedo, Danilo Santos, Angelo Perkusich, Dalton C. G. Valadares
Format: Article
Language: English
Published: MDPI AG, 2023-07-01
Series: Sensors
Subjects:
Online Access: https://www.mdpi.com/1424-8220/23/14/6286
Description
Summary: Federated learning (<i>FL</i>) is a distributed training method for machine learning (<i>ML</i>) models that keeps data ownership with users. However, this distributed training approach can vary in efficiency depending on user behavior and device characteristics. For instance, mobility can hinder training by causing a client to drop out when its device loses connection with the other devices on the network. To address this issue, we propose an <i>FL</i> coordination algorithm, <i>MoFeL</i>, that ensures efficient training even in scenarios with mobility. Furthermore, <i>MoFeL</i> evaluates multiple networks with different central servers. To assess its effectiveness, we conducted simulation experiments using an image classification application whose models are trained with a convolutional neural network. The simulation results show that, in scenarios with high mobility, <i>MoFeL</i> completes 156.5% more training cycles than a traditional <i>FL</i> coordination algorithm that does not take mobility into account.
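The summary describes coordinating training so that mobile clients who are about to leave the network do not cause dropouts mid-round. The paper's actual <i>MoFeL</i> logic is not given in this record; the sketch below shows only a generic mobility-aware client-selection step under assumed inputs (the `expected_dwell_time` field, i.e. the predicted seconds a device stays on its current network, is a hypothetical name introduced here for illustration).

```python
def select_clients(clients, num_needed, round_duration):
    """Pick clients predicted to stay connected for a whole training round.

    clients: list of dicts with hypothetical fields 'id' and
    'expected_dwell_time' (predicted seconds the device remains
    reachable on the current network).
    """
    # Filter out clients likely to leave mid-round, which would
    # otherwise cause the dropouts described in the summary.
    stable = [c for c in clients if c["expected_dwell_time"] >= round_duration]
    # Prefer the most stable clients first.
    stable.sort(key=lambda c: c["expected_dwell_time"], reverse=True)
    return [c["id"] for c in stable[:num_needed]]

clients = [
    {"id": "a", "expected_dwell_time": 120},
    {"id": "b", "expected_dwell_time": 15},
    {"id": "c", "expected_dwell_time": 300},
]
print(select_clients(clients, 2, round_duration=60))  # ['c', 'a']
```

A coordinator evaluating multiple networks, as <i>MoFeL</i> does, could run a step like this per central server and route each client to the network where it is expected to remain longest.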
ISSN: 1424-8220