A Differential Privacy Strategy Based on Local Features of Non-Gaussian Noise in Federated Learning

As an emerging artificial intelligence technology, federated learning plays a significant role in privacy preservation for machine learning, although its main objective is to prevent participating peers from peeping at one another's data. However, outside attackers can steal metadata in transit and recover the original data through data reconstruction or other techniques, which poses a great threat to the security of a federated learning system. In this paper, we propose a differential privacy strategy, including encryption and decryption methods based on local features of non-Gaussian noise, which aggregates the noisy metadata through a sequential Kalman filter in federated learning scenarios to increase the reliability of the federated learning method. We refer to these local features of non-Gaussian noise as non-Gaussian noise fragments. Compared with traditional methods, the proposed method shows stronger security performance for two reasons. First, non-Gaussian noise fragments have more complex statistics, making them more difficult for attackers to identify. Second, in order to obtain accurate statistical features, attackers must aggregate all of the noise fragments, which is very difficult given the growing number of clients. Our experiments demonstrate that the proposed method can greatly enhance the system's security.

Bibliographic Details
Main Authors: Xinyi Wang, Jincheng Wang, Xue Ma, Chenglin Wen
Format: Article
Language: English
Published: MDPI AG, 2022-03-01
Series: Sensors, vol. 22, no. 7, article 2424
ISSN: 1424-8220
DOI: 10.3390/s22072424
Subjects: federated learning (FL); differential privacy; Kalman filter; non-Gaussian noise
Author Affiliations: Xinyi Wang, Jincheng Wang, and Xue Ma (School of Automation, Hangzhou Dianzi University, Hangzhou 310018, China); Chenglin Wen (School of Automation, Guangdong University of Petrochemical Technology, Maoming 525000, China)
Online Access: https://www.mdpi.com/1424-8220/22/7/2424
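
Illustrative sketch. The abstract describes clients perturbing their local updates with fragments of a shared non-Gaussian noise process (the encryption step) and a server that removes the known fragment features and fuses the noisy uploads with a sequential Kalman filter (the decryption and aggregation step). This record does not reproduce the paper's exact construction, so the Python sketch below only illustrates that general idea under stated assumptions: a two-component Gaussian mixture stands in for the non-Gaussian noise, each client draws its fragment from one mixture component, and the server performs a scalar sequential Kalman-style fusion. All names (FRAGMENT_MEANS, client_update, server_aggregate) are hypothetical and not taken from the paper.

import numpy as np

# Hypothetical fragment parameters shared between clients and server; they
# stand in for the paper's "local features of non-Gaussian noise".
FRAGMENT_MEANS = np.array([-0.5, 0.8])
FRAGMENT_STDS = np.array([0.05, 0.10])

def client_update(true_grad, component, rng):
    """Client-side step: add a noise fragment drawn from a single mixture
    component to the local gradient before uploading it."""
    noise = rng.normal(FRAGMENT_MEANS[component], FRAGMENT_STDS[component],
                       size=true_grad.shape)
    return true_grad + noise

def server_aggregate(uploads, components, prior_mean, prior_var):
    """Server-side step: subtract each client's known fragment mean, then fold
    each upload in with a scalar sequential Kalman update, treating it as a
    measurement of the common global gradient."""
    x, p = prior_mean.copy(), prior_var
    for z, c in zip(uploads, components):
        z_corrected = z - FRAGMENT_MEANS[c]   # undo the fragment's known bias
        r = FRAGMENT_STDS[c] ** 2             # measurement noise variance
        k = p / (p + r)                       # Kalman gain
        x = x + k * (z_corrected - x)         # gradient estimate update
        p = (1.0 - k) * p                     # covariance update
    return x

rng = np.random.default_rng(0)
true_grad = np.array([0.2, -0.1, 0.05, 0.3])      # toy "global" gradient
components = [i % 2 for i in range(10)]           # one fragment per client
uploads = [client_update(true_grad, c, rng) for c in components]
estimate = server_aggregate(uploads, components,
                            prior_mean=np.zeros(4), prior_var=1.0)
print("fused gradient estimate:", np.round(estimate, 3))

An eavesdropper who intercepts only a few uploads observes samples from individual mixture components and cannot recover the full mixture statistics, which mirrors the abstract's argument that all fragments must be collected before the noise can be characterized.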