Privacy-Preserving Distributed Deep Learning via Homomorphic Re-Encryption
Main Authors: | Fengyi Tang, Wei Wu, Jian Liu, Huimei Wang, Ming Xian
Format: | Article
Language: | English
Published: | MDPI AG, 2019-04-01
Series: | Electronics
Subjects: | data privacy; distributed databases; machine learning
Online Access: | https://www.mdpi.com/2079-9292/8/4/411
_version_ | 1798040308193165312 |
author | Fengyi Tang; Wei Wu; Jian Liu; Huimei Wang; Ming Xian
author_sort | Fengyi Tang |
collection | DOAJ |
description | The flourishing of deep learning on distributed training datasets raises concerns about data privacy. Recent work on privacy-preserving distributed deep learning assumes that the server and the learning participants do not collude. If they do collude, the server can decrypt and obtain the data of every learning participant. Moreover, because all learning participants hold the same private key, each participant must connect to the server over its own TLS/SSL secure channel to avoid leaking data to the other participants. To fix these problems, we propose a privacy-preserving distributed deep learning scheme with the following improvements: (1) no information is leaked to the server even if a learning participant colludes with it; (2) learning participants do not need separate secure channels to communicate with the server; and (3) the deep learning model achieves higher accuracy. We obtain these properties by introducing a key transform server and applying homomorphic re-encryption to asynchronous stochastic gradient descent for deep learning. We show that our scheme adds a tolerable communication cost to the deep learning system while achieving stronger security properties, and that the computational cost for learning participants remains comparable. Overall, our scheme offers a more secure and more accurate deep learning solution for distributed learning participants. |
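The core primitive such schemes build on is additively homomorphic encryption, which lets an untrusted server aggregate encrypted gradients without ever decrypting them. The sketch below illustrates that property with a toy Paillier cryptosystem; it is an illustrative stand-in, not the authors' re-encryption construction, and the hard-coded primes are hypothetical and far too small for real use.

```python
import random

def paillier_keygen(p=1000003, q=1000033):
    # Hypothetical toy primes; a real deployment needs a >= 2048-bit modulus.
    n = p * q
    lam = (p - 1) * (q - 1)   # multiple of Carmichael's lambda(n); valid with g = n + 1
    mu = pow(lam, -1, n)      # modular inverse of lam mod n (Python 3.8+)
    return (n,), (lam, mu, n)

def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)  # random blinding factor
    # c = (1 + n)^m * r^n mod n^2, using (1 + n)^m = 1 + m*n (mod n^2)
    return (1 + m * n) % n2 * pow(r, n, n2) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    # L(x) = (x - 1) / n, then m = L(c^lam mod n^2) * mu mod n
    x = pow(c, lam, n * n)
    return (x - 1) // n * mu % n

pk, sk = paillier_keygen()
# Two participants encrypt (integer-scaled) gradients; the server combines
# them by multiplying ciphertexts, never seeing either plaintext.
g1, g2 = 314, 159
aggregate = encrypt(pk, g1) * encrypt(pk, g2) % (pk[0] ** 2)
print(decrypt(sk, aggregate))  # 473 = g1 + g2
```

In the paper's setting, re-encryption additionally lets the key transform server switch a ciphertext from one participant's key to another's, which is what removes the need for a shared private key and for per-participant secure channels.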
first_indexed | 2024-04-11T22:05:49Z |
format | Article |
id | doaj.art-1d4b0c2de6e74b37a937114fb6a46e72 |
institution | Directory Open Access Journal |
issn | 2079-9292 |
language | English |
last_indexed | 2024-04-11T22:05:49Z |
publishDate | 2019-04-01 |
publisher | MDPI AG |
record_format | Article |
series | Electronics |
spelling | doaj.art-1d4b0c2de6e74b37a937114fb6a46e72 (updated 2022-12-22T04:00:43Z). Electronics (MDPI AG), ISSN 2079-9292, published 2019-04-01, Vol. 8, Issue 4, Article 411; DOI 10.3390/electronics8040411. All five authors (Fengyi Tang, Wei Wu, Jian Liu, Huimei Wang, Ming Xian) are affiliated with the College of Electronic Science and Technology, National University of Defense Technology, Changsha 410073, China. Subjects: data privacy; distributed databases; machine learning. Online access: https://www.mdpi.com/2079-9292/8/4/411
title | Privacy-Preserving Distributed Deep Learning via Homomorphic Re-Encryption |
topic | data privacy; distributed databases; machine learning
url | https://www.mdpi.com/2079-9292/8/4/411 |