SpaceDML: Enabling Distributed Machine Learning in Space Information Networks

2021 
Space information networks (SINs) have become a rapidly growing global infrastructure service. Massive volumes of high-resolution images and videos captured by low-orbit satellites and unmanned aerial vehicles provide a rich source of training data for machine learning applications. However, the limited communication and computation resources of SIN devices make it challenging to perform machine learning efficiently across a swarm of SIN devices. In this article, we propose SpaceDML, a distributed machine learning system for SIN platforms that applies dynamic model compression techniques to adapt distributed machine learning training to SINs' limited bandwidth and unstable connectivity. SpaceDML has two key algorithms: adaptive loss-aware quantization, which compresses models without sacrificing their quality, and partial weight averaging, which selectively averages the partial model updates of active clients. These algorithms jointly improve communication efficiency and enhance the scalability of distributed machine learning with SIN devices. We evaluate SpaceDML by training a LeNet-5 model on the MNIST dataset. The experimental results show that SpaceDML can increase model accuracy by 2–3 percent and reduce communication bandwidth consumption by up to 60 percent compared to the baseline algorithm.
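The abstract only names the two algorithms; the sketch below is a hypothetical illustration in plain NumPy of what adaptive loss-aware quantization and partial weight averaging could look like, not the paper's implementation. The function names, bit-width candidates, and error budget are assumptions introduced purely for illustration.

```python
# Illustrative sketch (assumed, not the paper's code) of the two ideas named in the abstract.
import numpy as np

def loss_aware_quantize(weights, bit_candidates=(2, 4, 8)):
    """Pick the lowest bit width whose quantization error stays below an assumed budget.

    The paper's adaptive loss-aware quantization adapts precision to the training loss;
    here we use a simple norm-based error budget as a stand-in.
    """
    budget = 1e-2 * np.linalg.norm(weights)  # assumed error budget
    for bits in bit_candidates:
        levels = 2 ** bits - 1
        w_min, w_max = weights.min(), weights.max()
        scale = (w_max - w_min) / levels if w_max > w_min else 1.0
        q = np.round((weights - w_min) / scale) * scale + w_min  # uniform quantization
        if np.linalg.norm(q - weights) <= budget:
            return q, bits
    return weights, 32  # fall back to full precision

def partial_weight_average(global_weights, client_updates, active_ids):
    """Average only the updates received from currently reachable (active) clients."""
    active = [client_updates[i] for i in active_ids if i in client_updates]
    if not active:
        return global_weights  # no reachable clients this round
    return np.mean(np.stack(active), axis=0)

# Toy usage: three clients, one of which is unreachable this round.
rng = np.random.default_rng(0)
global_w = rng.normal(size=100)
updates = {0: global_w + 0.1 * rng.normal(size=100),
           1: global_w + 0.1 * rng.normal(size=100)}
new_global = partial_weight_average(global_w, updates, active_ids=[0, 1, 2])
compressed, bits = loss_aware_quantize(new_global)
print(f"quantized to {bits} bits")
```

The intent this sketch tries to capture: quantization trims the bits sent over SINs' constrained links, while partial averaging lets a training round proceed even when some satellites or UAVs are temporarily disconnected.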