Impact of Network Topology on the Convergence of Decentralized Federated Learning Systems

2021 
Federated learning is a popular framework for harnessing the computational power of edge devices to train a machine learning model in a distributed fashion. However, it is not always feasible or beneficial to have a centralized server that controls and synchronizes the training process. In this paper, we consider the problem of training a machine learning model over a network of nodes in a fully decentralized fashion. In particular, we look for empirical evidence of how sensitive the training process is to various network characteristics and communication parameters. We present the outcomes of several simulations conducted with different network topologies, datasets, and machine learning models.
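
To make the decentralized setting concrete, the following is a minimal sketch (not taken from the paper) of consensus-based decentralized SGD, in which each node takes a local gradient step and then averages its parameters with its neighbours according to a mixing matrix derived from the network topology. The ring topology, Metropolis weights, quadratic local losses, and all parameter values below are illustrative assumptions, not the authors' experimental setup.

```python
# Sketch of decentralized SGD over a network topology (illustrative only).
import numpy as np

def ring_topology(n):
    """Adjacency matrix of a ring of n nodes (each node has 2 neighbours)."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1.0
    return A

def metropolis_weights(A):
    """Doubly stochastic mixing matrix built from an adjacency matrix."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    W = np.zeros_like(A)
    for i in range(n):
        for j in range(n):
            if A[i, j] > 0:
                W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    return W

def decentralized_sgd(W, local_grads, x0, lr=0.1, rounds=100):
    """Each node takes a local gradient step, then averages with neighbours."""
    x = np.tile(x0, (W.shape[0], 1))  # one parameter vector per node
    for _ in range(rounds):
        x = x - lr * np.vstack([g(xi) for g, xi in zip(local_grads, x)])
        x = W @ x  # neighbour averaging defined by the topology
    return x

if __name__ == "__main__":
    n, d = 8, 2
    rng = np.random.default_rng(0)
    targets = rng.normal(size=(n, d))                 # each node's local optimum
    grads = [lambda x, t=t: x - t for t in targets]   # quadratic local losses
    W = metropolis_weights(ring_topology(n))
    x = decentralized_sgd(W, grads, np.zeros(d))
    print("spread across nodes:", np.linalg.norm(x - x.mean(axis=0)))
```

Swapping `ring_topology` for a denser graph (e.g. a fully connected one) changes the mixing matrix and hence how quickly the per-node models agree, which is the kind of topology sensitivity the paper studies empirically.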