Federated Learning with Dynamic Model Exchange
Large amounts of data are needed to train accurate, robust machine learning models, but the acquisition of these data is complicated by strict regulations. While many business sectors often have unused data silos, researchers face the problem of not being able to obtain large amounts of real-world data. This is especially true in the healthcare sector, since transferring these data is often associated with bureaucratic overhead because of, for example, increased security requirements and privacy laws. Federated Learning circumvents this problem by allowing training to take place directly on the data owner’s side, without sending the data to a central location such as a server. Currently, several frameworks exist for this purpose, such as TensorFlow Federated, Flower, or PySyft/PyGrid. These frameworks define models for both the server and the client, since the coordination of the training is performed by a server. Here, we present a practical method that enables a dynamic exchange of the model, so that the model is not statically stored in source code. During this process, the model architecture and training configuration are defined by the researchers and sent to the server, which passes the settings to the clients. In addition, the model is transformed by the data owner to incorporate Differential Privacy. To compare Federated Learning with centralised learning and to assess the impact of Differential Privacy, performance and security evaluation experiments were conducted. It was found that Federated Learning can achieve results on par with centralised learning and that the use of Differential Privacy can improve the robustness of the model against Membership Inference Attacks in an honest-but-curious setting.
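The dynamic model exchange described in the abstract amounts to serialising the researcher's architecture and training configuration and forwarding it via the server to the clients, instead of hard-coding the model in the client source. A minimal sketch of that idea, where the field names and the JSON wire format are illustrative assumptions rather than the paper's actual protocol:

```python
import json

# Hypothetical model specification a researcher might submit; the keys
# ("architecture", "training", etc.) are made up for this sketch.
model_spec = {
    "architecture": [
        {"layer": "dense", "units": 64, "activation": "relu"},
        {"layer": "dense", "units": 10, "activation": "softmax"},
    ],
    "training": {"optimizer": "sgd", "lr": 0.01, "epochs": 5},
}

payload = json.dumps(model_spec)   # researcher -> server
received = json.loads(payload)     # server -> client

def summarize_layers(spec):
    """The client rebuilds the model from the received spec instead of
    relying on a model statically stored in its own source code."""
    return [f"{l['layer']}({l['units']}, {l['activation']})"
            for l in spec["architecture"]]

print(summarize_layers(received))  # → ['dense(64, relu)', 'dense(10, softmax)']
```

Because only a declarative spec travels over the wire, the same client code can train any architecture the researcher defines.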
Main Authors: Hannes Hilberger, Sten Hanke, Markus Bödenler
Format: Article
Language: English
Published: MDPI AG, 2022-05-01
Series: Electronics
Subjects: Federated Learning; Differential Privacy; privacy preserving; membership inference attack
Online Access: https://www.mdpi.com/2079-9292/11/10/1530
author | Hannes Hilberger; Sten Hanke; Markus Bödenler |
collection | DOAJ |
description | Large amounts of data are needed to train accurate, robust machine learning models, but the acquisition of these data is complicated by strict regulations. While many business sectors often have unused data silos, researchers face the problem of not being able to obtain large amounts of real-world data. This is especially true in the healthcare sector, since transferring these data is often associated with bureaucratic overhead because of, for example, increased security requirements and privacy laws. Federated Learning circumvents this problem by allowing training to take place directly on the data owner’s side, without sending the data to a central location such as a server. Currently, several frameworks exist for this purpose, such as TensorFlow Federated, Flower, or PySyft/PyGrid. These frameworks define models for both the server and the client, since the coordination of the training is performed by a server. Here, we present a practical method that enables a dynamic exchange of the model, so that the model is not statically stored in source code. During this process, the model architecture and training configuration are defined by the researchers and sent to the server, which passes the settings to the clients. In addition, the model is transformed by the data owner to incorporate Differential Privacy. To compare Federated Learning with centralised learning and to assess the impact of Differential Privacy, performance and security evaluation experiments were conducted. It was found that Federated Learning can achieve results on par with centralised learning and that the use of Differential Privacy can improve the robustness of the model against Membership Inference Attacks in an honest-but-curious setting. |
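The combination the abstract describes (the data owner transforms the model to incorporate Differential Privacy before participating, and the server merely aggregates) can be illustrated with a generic DP-SGD-style sketch: each client clips its update's L2 norm and adds Gaussian noise locally, then the server performs plain federated averaging. The hyperparameters `clip_norm` and `noise_mult` are illustrative assumptions, not values from the paper:

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_mult=0.5, rng=None):
    """Client-side transformation: bound the update's L2 norm, then add
    Gaussian noise before it leaves the data owner (DP-SGD style)."""
    if rng is None:
        rng = np.random.default_rng(0)
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / norm)
    return clipped + rng.normal(0.0, noise_mult * clip_norm, size=update.shape)

def federated_average(client_updates):
    """Server-side aggregation: plain FedAvg over the privatised updates;
    the server never sees raw, un-noised client data."""
    return np.mean(client_updates, axis=0)

# Three simulated data owners privatise their updates locally,
# then the server averages what it receives.
rng = np.random.default_rng(42)
raw = [np.array([0.2, -0.1]), np.array([0.4, 0.0]), np.array([0.3, 0.1])]
private = [privatize_update(u, rng=rng) for u in raw]
new_delta = federated_average(private)
```

Adding the noise on the data owner's side, as the abstract specifies, means privacy holds even against an honest-but-curious server, which is exactly the threat model under which the paper evaluates Membership Inference Attacks.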
id | doaj.art-7ec1b1d2114843a8801f7af51e90a34a |
institution | Directory Open Access Journal |
issn | 2079-9292 |
doi | 10.3390/electronics11101530 |
citation | Electronics, vol. 11, no. 10, art. 1530 (2022-05-01) |
affiliation | eHealth Institute, FH JOANNEUM University of Applied Sciences, 8020 Graz, Austria (all three authors) |
title | Federated Learning with Dynamic Model Exchange |
topic | Federated Learning; Differential Privacy; privacy preserving; membership inference attack |
url | https://www.mdpi.com/2079-9292/11/10/1530 |