Reducing Model Cost Based on the Weights of Each Layer for Federated Learning Clustering

2021 
Federated Learning (FL) departs from conventional machine learning, which requires centralizing the training data. Because training is performed on each client device rather than on a central server, and only the resulting weight parameters are sent to the server, federated learning offers stronger privacy protection. However, federated learning typically achieves lower performance than centralized cloud computing, and in practice the high communication cost between the server and many clients makes a federated learning environment difficult to build. In this paper, we propose Federated Learning with Clustering algorithms (FLC). FLC analyzes the weights of each layer of a machine learning model to cluster clients with similar characteristics, and then performs federated learning within each cluster. By reducing the number of clients associated with each model, FLC lowers the communication cost per model. Extensive simulation confirms that, compared to standard federated learning, the proposed FLC improves accuracy by 2.4% and reduces the loss by 47%.
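The pipeline the abstract describes — represent each client by its per-layer weights, cluster clients with similar weights, then aggregate within each cluster — can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the weight flattening, the minimal k-means routine, and the in-cluster FedAvg step are all hypothetical choices standing in for whichever clustering and aggregation the paper actually uses.

```python
import numpy as np

def flatten_weights(layer_weights):
    """Concatenate a client's per-layer weight arrays into one feature vector."""
    return np.concatenate([w.ravel() for w in layer_weights])

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means over client feature vectors (illustrative stand-in
    for whatever clustering algorithm FLC actually employs)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each client to its nearest center (squared Euclidean distance).
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        # Move each non-empty center to the mean of its assigned clients.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def cluster_clients(client_weights, k):
    """Group clients whose layer weights are similar into k clusters."""
    X = np.stack([flatten_weights(w) for w in client_weights])
    return kmeans(X, k)

def federated_average(cluster_weights):
    """FedAvg within one cluster: element-wise mean of each layer
    across the cluster's clients."""
    return [np.mean(np.stack(layer), axis=0) for layer in zip(*cluster_weights)]
```

With this structure, the server only exchanges a model with the clients of one cluster, which is how the per-model communication cost falls as the abstract claims.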