Differentially private knowledge transfer for federated learning
Main Authors: | Tao Qi, Fangzhao Wu, Chuhan Wu, Liang He, Yongfeng Huang, Xing Xie |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2023-06-01 |
Series: | Nature Communications |
Online Access: | https://doi.org/10.1038/s41467-023-38794-x |
---|---|
author | Tao Qi, Fangzhao Wu, Chuhan Wu, Liang He, Yongfeng Huang, Xing Xie |
description | Abstract: Extracting useful knowledge from big data is important for machine learning. When data is privacy-sensitive and cannot be directly collected, federated learning is a promising option that extracts knowledge from decentralized data by learning and exchanging model parameters rather than raw data. However, model parameters may encode not only non-private knowledge but also private information about the local data, so transferring knowledge via model parameters is not privacy-secure. Here, we present a knowledge transfer method named PrivateKT, which uses a small set of actively selected public data to transfer high-quality knowledge in federated learning with privacy guarantees. We verify PrivateKT on three different datasets, and the results show that PrivateKT can reduce the performance gap between centralized learning and existing federated learning methods by up to 84% under strict differential privacy restrictions. PrivateKT offers a promising direction for effective and privacy-preserving knowledge transfer in machine intelligence systems. (An illustrative sketch of this style of private knowledge transfer follows the record below.) |
format | Article |
issn | 2041-1723 |
language | English |
publishDate | 2023-06-01 |
publisher | Nature Portfolio |
series | Nature Communications |
affiliation | Tao Qi (Department of Electronic Engineering, Tsinghua University); Fangzhao Wu (Microsoft Research Asia); Chuhan Wu (Department of Electronic Engineering, Tsinghua University); Liang He (Department of Electronic Engineering, Tsinghua University); Yongfeng Huang (Department of Electronic Engineering, Tsinghua University); Xing Xie (Microsoft Research Asia) |
title | Differentially private knowledge transfer for federated learning |
url | https://doi.org/10.1038/s41467-023-38794-x |
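The abstract describes the general idea of transferring knowledge through differentially private signals computed on a small public dataset instead of exchanging model parameters. The sketch below illustrates that general pattern only and is not the paper's PrivateKT implementation: the function names, the randomized-response mechanism, the privacy budget epsilon=2.0, the number of simulated clients, and the dummy client models are all assumptions introduced for illustration.

```python
# Minimal illustrative sketch (NOT the paper's PrivateKT implementation).
# Clients label a small shared public set, privatize the labels with
# randomized response (local differential privacy), and the server
# aggregates the noisy labels into soft labels for knowledge distillation.
import numpy as np


def randomized_response(label, num_classes, epsilon, rng):
    """Release one class label under epsilon-local differential privacy."""
    keep_prob = np.exp(epsilon) / (np.exp(epsilon) + num_classes - 1)
    if rng.random() < keep_prob:
        return label
    # Otherwise report one of the other classes uniformly at random.
    others = [c for c in range(num_classes) if c != label]
    return int(rng.choice(others))


def client_private_predictions(local_predict, public_samples, num_classes, epsilon, rng):
    """One client: predict on the public samples, then privatize each label."""
    labels = [int(np.argmax(local_predict(x))) for x in public_samples]
    return [randomized_response(y, num_classes, epsilon, rng) for y in labels]


def aggregate_soft_labels(per_client_labels, num_classes):
    """Server: turn noisy per-client labels into per-sample soft labels."""
    num_samples = len(per_client_labels[0])
    counts = np.zeros((num_samples, num_classes))
    for client_labels in per_client_labels:
        for i, y in enumerate(client_labels):
            counts[i, y] += 1
    return counts / counts.sum(axis=1, keepdims=True)


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    num_classes = 3
    public_samples = [np.zeros(4) for _ in range(5)]  # stand-in public data
    # Stand-in for each client's locally trained model: always predicts class 1.
    dummy_predict = lambda x: np.eye(num_classes)[1]
    per_client = [
        client_private_predictions(dummy_predict, public_samples,
                                   num_classes, epsilon=2.0, rng=rng)
        for _ in range(10)  # 10 simulated clients
    ]
    # Soft labels the server could use to distill knowledge into a global model.
    print(aggregate_soft_labels(per_client, num_classes))
```

In a scheme of this kind, only the randomized labels on public samples leave each client, so the privacy cost is governed by the local randomizer rather than by the model parameters themselves.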