Selective knowledge sharing for privacy-preserving federated distillation without a good teacher
Abstract: While federated learning (FL) is promising for efficient collaborative learning without revealing local data, it remains vulnerable to white-box privacy attacks, suffers from high communication overhead, and struggles to adapt to heterogeneous models. Federated distillation (FD) emerges as...
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2024-01-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-023-44383-9 |