Selective knowledge sharing for privacy-preserving federated distillation without a good teacher

Abstract: While federated learning (FL) is promising for efficient collaborative learning without revealing local data, it remains vulnerable to white-box privacy attacks, suffers from high communication overhead, and struggles to adapt to heterogeneous models. Federated distillation (FD) emerges as...


Bibliographic Details
Main Authors: Jiawei Shao, Fangzhao Wu, Jun Zhang
Format: Article
Language: English
Published: Nature Portfolio, 2024-01-01
Series: Nature Communications
Online Access: https://doi.org/10.1038/s41467-023-44383-9