CEEP-FL: A comprehensive approach for communication efficiency and enhanced privacy in federated learning
2021
Abstract Federated Learning (FL) is an emerging technique for collaboratively training machine learning models on distributed data under privacy constraints. However, recent studies have shown that FL consumes substantial communication resources during the global model update. In addition, participants' private data can be compromised by exploiting the shared parameters when local gradient updates are uploaded to the central cloud server, which hinders the wide deployment of FL. To address these challenges, we propose a novel comprehensive FL approach, named Communication Efficient and Enhanced Privacy (CEEP-FL). The proposed approach simultaneously aims to (1) minimize the communication cost, (2) protect data from being compromised, and (3) maximize the global learning accuracy. To minimize the communication cost, we first apply a novel filtering mechanism to each local gradient update and upload only the important gradients. We then apply a Non-Interactive Zero-Knowledge Proof based Homomorphic Cryptosystem (NIZKP-HC) to protect those local gradient updates while maintaining robustness in the network. Finally, we use Distributed Selective Stochastic Gradient Descent (DSSGD) optimization to minimize the computational cost and maximize the global learning accuracy. Experimental results on commonly used FL datasets demonstrate that CEEP-FL distinctly outperforms existing approaches.
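The abstract does not specify the filtering rule used to select "important" gradients. As a minimal sketch, the following assumes a magnitude-based (top-k) selection, a common choice for gradient sparsification in FL; the function name `filter_important_gradients` and the `upload_ratio` parameter are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def filter_important_gradients(gradient, upload_ratio=0.01):
    """Keep only the largest-magnitude entries of a local gradient update.

    Hypothetical sketch of the paper's filtering step: only the top
    fraction of gradient entries (by absolute value) is uploaded; the
    rest stay local. `upload_ratio` is an assumed tuning parameter.
    """
    flat = gradient.ravel()
    k = max(1, int(upload_ratio * flat.size))
    # Indices of the k largest-magnitude gradient entries.
    top_idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[top_idx] = flat[top_idx]
    return sparse.reshape(gradient.shape), top_idx

# Example: a client filters its update before encrypting and uploading it.
rng = np.random.default_rng(0)
local_update = rng.normal(size=(4, 5))
filtered, kept = filter_important_gradients(local_update, upload_ratio=0.1)
print(f"uploading {kept.size} of {local_update.size} gradient entries")
```

In the full pipeline described by the abstract, the filtered update would then be encrypted under the NIZKP-HC scheme before being sent to the server, so the server aggregates only protected, sparsified gradients.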