FedTKD: a trustworthy heterogeneous federated learning based on adaptive knowledge distillation
Federated learning allows multiple parties to jointly train models while protecting user privacy. However, traditional federated learning requires every client to use the same model structure so that a global model can be fused. In real-world scenarios, each client may need to develop personalized models based...
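As background for the abstract, the sketch below illustrates the generic logit-based knowledge distillation mechanism that the title alludes to: one client's softened predictions on a shared batch supervise another client with a different architecture, which is what lets heterogeneous models exchange knowledge without sharing weights. This is a minimal illustration only, not the FedTKD procedure described in the article; the model shapes, temperature, and mixing weight are assumptions made for demonstration.

```python
# Minimal sketch of logit-based knowledge distillation between two clients
# with heterogeneous model structures. NOT the FedTKD algorithm itself;
# layer sizes, temperature T, and weight alpha are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

T, alpha = 4.0, 0.7  # assumed softening temperature and distillation weight

# Two clients with different (heterogeneous) architectures for the same task.
client_a = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
client_b = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

x = torch.randn(8, 32)          # a shared (e.g. public) batch
y = torch.randint(0, 10, (8,))  # ground-truth labels for that batch

with torch.no_grad():
    teacher_logits = client_a(x)  # "teacher" knowledge from the other client

student_logits = client_b(x)

# Soft-label KL term: align the student's softened distribution with the
# teacher's; scaling by T^2 keeps gradient magnitudes comparable.
kd_loss = F.kl_div(
    F.log_softmax(student_logits / T, dim=1),
    F.softmax(teacher_logits / T, dim=1),
    reduction="batchmean",
) * (T * T)

# Combined objective: distillation term plus ordinary supervised loss.
loss = alpha * kd_loss + (1 - alpha) * F.cross_entropy(student_logits, y)
loss.backward()
```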
Main Authors: Chen, Leiming; Zhang, Weishan; Dong, Cihao; Zhao, Dehai; Zeng, Xingjie; Qiao, Sibo; Zhu, Yichang; Tan, Chee Wei
Other Authors: School of Civil and Environmental Engineering
Format: Journal Article
Language: English
Published: 2024
Online Access: https://hdl.handle.net/10356/174735
Similar Items
- FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation
  by: Leiming Chen, et al.
  Published: (2024-01-01)
- FedDRL: trustworthy federated learning model fusion method based on staged reinforcement learning
  by: Chen, Leiming, et al.
  Published: (2024)
- FedRAD: Heterogeneous Federated Learning via Relational Adaptive Distillation
  by: Jianwu Tang, et al.
  Published: (2023-07-01)
- Federated Distillation Methodology for Label-Based Group Structures
  by: Geonhee Yang, et al.
  Published: (2023-12-01)
- FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning
  by: Yikai Xu, et al.
  Published: (2023-01-01)