ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training
2021
Federated learning is a powerful distributed learning scheme that allows
numerous edge devices to collaboratively train a model without sharing their
data. However, training is resource-intensive for edge devices, and limited
network bandwidth is often the main bottleneck. Prior work often overcomes the
constraints by condensing the models or messages into compact formats, e.g., by
gradient compression or distillation. In contrast, we propose ProgFed, the
first progressive training framework for efficient and effective federated
learning. It inherently reduces computation and two-way communication costs
while maintaining the strong performance of the final models. We theoretically
prove that ProgFed converges at the same asymptotic rate as standard training
on full models. Extensive results on a broad range of architectures, including
CNNs (VGG, ResNet, ConvNets) and U-nets, and diverse tasks from simple
classification to medical image segmentation show that our highly effective
training approach saves up to $20\%$ computation and up to $63\%$ communication
costs for converged models. As our approach is also complementary to prior work
on compression, we can achieve a wide range of trade-offs, showing reduced
communication of up to $50\times$ at only $0.1\%$ loss in utility.
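To make the core idea concrete, below is a minimal sketch of progressive federated training, assuming a model split into sequential stages whose active depth is grown over training rounds, with a lightweight auxiliary head attached to the currently active prefix. All names here (StagewiseModel, grow, fedavg) are hypothetical illustrations, not the authors' implementation.

```python
import copy
import torch
import torch.nn as nn

class StagewiseModel(nn.Module):
    """Model split into stages; only the first `depth` stages (plus a small
    auxiliary head) are trained and communicated early in training."""
    def __init__(self, stages, head_fn):
        super().__init__()
        self.stages = nn.ModuleList(stages)
        self.head_fn = head_fn            # builds a lightweight head for a given depth
        self.depth = 1
        self.head = head_fn(self.depth)

    def grow(self):
        """Activate the next stage and rebuild the auxiliary head."""
        if self.depth < len(self.stages):
            self.depth += 1
            self.head = self.head_fn(self.depth)

    def forward(self, x):
        for stage in self.stages[: self.depth]:
            x = stage(x)
        return self.head(x)

def fedavg(state_dicts):
    """Average client parameters (standard FedAvg over the active submodel)."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(0)
    return avg
```

In such a setup, each round the server would broadcast and aggregate only the parameters of the active submodel, calling `grow()` on a fixed schedule until the full model is trained; the savings in two-way communication and client computation come from the early rounds operating on the smaller submodels.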