LaF: Lattice-Based and Communication-Efficient Federated Learning

2022 
Federated learning is an emerging technology that allows a server to train a global model with the cooperation of participants without exposing the participants' data. In recent years, many studies have focused on maintaining participant privacy against honest-but-curious servers. In 2017, Google proposed a promising solution that applies double masking and secret sharing to protect participants' gradients in each round of federated learning (CCS'17). However, this solution fails to achieve post-quantum security and incurs high communication overhead to distribute secret shares. To address this problem, this work designs a lattice-based multi-use secret sharing scheme that avoids distributing new secret shares to all participants in each round of federated learning while achieving post-quantum security. In other words, this new tool allows each participant to update his secret shares locally while preserving the privacy of participants' gradients against quantum attacks. Finally, this work applies this new secret sharing technique to construct a lattice-based federated learning protocol, LaF. The theoretical analysis demonstrates that LaF substantially reduces communication costs compared with Google's solution, and the experimental results show that LaF achieves higher runtime efficiency.
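The double-masking approach from Google's CCS'17 solution, which this work builds on, can be sketched as follows: each participant adds pairwise masks to its gradient so that the masks cancel in the server's aggregate, leaving the server with only the sum. This is a minimal illustrative sketch, not the paper's protocol; all names are hypothetical, and it assumes every pair of participants has already agreed on a shared seed (in practice via key agreement).

```python
import random

def masked_update(x, uid, peers, seeds):
    """Mask gradient x with pairwise masks that cancel in the server's sum.

    seeds[(u, v)] with u < v is a seed both users derived beforehand
    (hypothetical helper; real protocols use key agreement and a PRG).
    """
    y = x
    for v in peers:
        pair = (min(uid, v), max(uid, v))
        m = random.Random(seeds[pair]).randrange(1 << 16)
        # The lower-id user adds the mask, the higher-id user subtracts it,
        # so each pair's contributions cancel when the server sums.
        y += m if uid < v else -m
    return y

# Three participants with private gradients; every pair shares a seed.
users = [0, 1, 2]
grads = {0: 5, 1: 7, 2: 11}
seeds = {(u, v): 1000 + 10 * u + v for u in users for v in users if u < v}

masked = [masked_update(grads[u], u, [v for v in users if v != u], seeds)
          for u in users]
# The server only sees masked values, yet their sum equals the true sum.
print(sum(masked))  # 23
```

Note that this sketch only shows the cancellation property; the full scheme additionally uses a self-mask and secret sharing so that dropped-out participants' masks can be recovered, which is exactly the step whose shares LaF's multi-use scheme lets participants update locally.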