Budget-Aware Online Control of Edge Federated Learning on Streaming Data with Stochastic Inputs

2021 
Performing federated learning continuously in edge networks, while training data are dynamically and unpredictably streamed to the devices, faces critical challenges: global model convergence, the long-term resource budget, and uncertainty in the stochastic network and execution environment. We formulate an integer program that captures all these challenges and minimizes the cumulative total latency of stream learning on device and federated learning between devices and the edge server. We then decouple the problem, design an online learning algorithm for controlling the number of local model updates via a convex-concave reformulation and rectified gradient-descent steps, and design a bandit learning algorithm for selecting the edge server for global model aggregations that incorporates budget information to strike the explore-exploit balance. We rigorously prove sub-linear regret with respect to the optimization objective and sub-linear constraint violation with respect to the maximal on-device load, while guaranteeing convergence of the trained global model. Extensive evaluations with real-world training data and input traces confirm the empirical superiority of our approach over multiple state-of-the-art algorithms.
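The abstract mentions a bandit learner that picks an edge server for global aggregations while respecting a cost budget. The paper's exact algorithm is not given here, so the sketch below illustrates the general idea with a standard budget-aware UCB-style rule: each server is an arm with a per-aggregation cost, and the learner pulls the arm with the best upper-confidence reward-per-cost index until the budget is spent. All names, costs, and the reward model (reward = 1 − normalized latency plus noise) are illustrative assumptions, not the authors' method.

```python
import math
import random

def budget_aware_ucb(n_servers, costs, true_latencies, budget, seed=0):
    """Illustrative budget-aware UCB sketch (not the paper's algorithm).

    Each round, select the server maximizing a reward-per-cost UCB index;
    stop when the remaining budget cannot cover another aggregation.
    """
    rng = random.Random(seed)
    counts = [0] * n_servers          # pulls per server
    means = [0.0] * n_servers         # empirical mean reward per server
    t = 0
    total_cost = 0.0
    while total_cost + min(costs) <= budget:
        t += 1
        if t <= n_servers:
            arm = t - 1               # initialise: play each server once
        else:
            arm = max(
                range(n_servers),
                key=lambda i: (means[i]
                               + math.sqrt(2 * math.log(t) / counts[i]))
                              / costs[i],
            )
        # Assumed reward model: 1 - latency, with small observation noise.
        reward = max(0.0, 1.0 - true_latencies[arm] + rng.gauss(0, 0.05))
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]
        total_cost += costs[arm]
    return counts, total_cost
```

With a clearly fastest server, the index rule concentrates pulls on it while the confidence term keeps occasional exploration of the others, and the cost check enforces the hard budget; the paper additionally couples this choice with the on-device update control, which this sketch omits.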