FedMDR: Federated Model Distillation with Robust Aggregation

2021 
This paper presents FedMDR, a federated model distillation framework with a novel, robust aggregation mechanism that exploits transfer learning and knowledge distillation. FedMDR adopts weighted geometric-median-based aggregation with trimmed prediction accuracy on the server side, which orchestrates communication-efficient training over both heterogeneous model architectures and non-i.i.d. data. The aggregation provides resilience against the sharp accuracy drops caused by corrupted models. We also extend FedMDR to support differential privacy by adding Gaussian noise to the aggregated consensus. Results show that FedMDR achieves significant robustness gains with satisfactory accuracy, and outperforms existing techniques.
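The abstract does not spell out the aggregation rule in detail, but the described mechanism can be sketched as follows: each client reports its predictions (e.g., logits on a public transfer set) along with a prediction-accuracy score; the server trims low-accuracy clients, computes an accuracy-weighted geometric median of the remaining predictions via the standard Weiszfeld iteration, and optionally adds Gaussian noise to the consensus for the differential-privacy extension. This is a minimal sketch under those assumptions; the function names, the trimming fraction `trim_frac`, the accuracy weighting, and the noise scale `sigma` are illustrative choices, not taken from the paper.

```python
import numpy as np

def weighted_geometric_median(points, weights, n_iter=100, eps=1e-8):
    """Weiszfeld iteration for the weighted geometric median of row vectors."""
    median = np.average(points, axis=0, weights=weights)  # initialize at weighted mean
    for _ in range(n_iter):
        dists = np.maximum(np.linalg.norm(points - median, axis=1), eps)  # avoid /0
        inv = weights / dists
        new_median = (inv[:, None] * points).sum(axis=0) / inv.sum()
        if np.linalg.norm(new_median - median) < eps:
            break
        median = new_median
    return median

def aggregate_consensus(client_logits, client_acc, trim_frac=0.2, sigma=0.0, rng=None):
    """Hypothetical server-side aggregation in the spirit of FedMDR:
    trim the lowest-accuracy clients, take an accuracy-weighted geometric
    median of their predictions on the public transfer set, and optionally
    add Gaussian noise to the consensus (the DP extension)."""
    client_logits = np.asarray(client_logits)   # shape: (n_clients, n_samples, n_classes)
    client_acc = np.asarray(client_acc)
    n_keep = max(1, int(np.ceil((1 - trim_frac) * len(client_acc))))
    keep = np.argsort(client_acc)[-n_keep:]     # drop clients with sharp accuracy drops
    weights = client_acc[keep] / client_acc[keep].sum()   # assumed accuracy weighting
    flat = client_logits[keep].reshape(n_keep, -1)
    consensus = weighted_geometric_median(flat, weights).reshape(client_logits.shape[1:])
    if sigma > 0:                               # Gaussian noise on the aggregated consensus
        rng = rng or np.random.default_rng()
        consensus = consensus + rng.normal(0.0, sigma, size=consensus.shape)
    return consensus
```

The geometric median is what gives the robustness: unlike a weighted mean, it is resistant to a minority of corrupted clients whose predictions deviate arbitrarily, and the accuracy-based trimming removes the most obviously degraded models before aggregation.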