Distributed Gradient Tracking Methods with Finite Data Rates
2021
This paper studies the distributed optimization problem over an undirected connected graph subject to digital communication with a finite data rate, where each agent holds a strongly convex and smooth local cost function and the agents cooperatively minimize the average of all local cost functions. Each agent builds an encoder/decoder pair: it transmits messages to its neighbors through a finite-level uniform quantizer and recovers its neighbors' states with a recursive decoder applied to the received quantized signals. Combining this adaptive encoding/decoding scheme with the gradient tracking method, the authors propose a distributed quantized algorithm and prove that the optimal solution is reached at a linear rate, even when agents communicate at a 1-bit data rate. Numerical examples illustrate the theoretical results.
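To make the scheme concrete, the following Python sketch combines a standard gradient-tracking update with a recursive encoder/decoder built around a finite-level uniform quantizer, in the spirit of the algorithm described above. The ring graph, the scalar quadratic costs, the step size `alpha`, the number of quantizer `levels`, and the geometric decay rate `gamma` of the scaling factor are all illustrative assumptions, not the paper's exact construction or parameter choices.

```python
# Minimal sketch: gradient tracking with quantized communication over a ring
# of n agents, each holding a scalar quadratic cost f_i(x) = 0.5*a_i*x^2 + b_i*x.
# Parameter values are illustrative assumptions, not taken from the paper.
import numpy as np

def uniform_quantizer(v, levels, scale):
    """Finite-level uniform quantizer: round v/scale to the nearest integer
    and saturate at +/- levels; the decoder recovers scale * q."""
    return np.clip(np.round(v / scale), -levels, levels)

rng = np.random.default_rng(0)
n = 5                                     # number of agents on a ring graph
a = rng.uniform(1.0, 2.0, n)              # strong-convexity parameters
b = rng.uniform(-1.0, 1.0, n)
grad = lambda x: a * x + b                # local gradients, evaluated per agent
x_star = -b.sum() / a.sum()               # minimizer of the average cost

# Doubly stochastic mixing matrix on the ring (equal weights, illustrative).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

alpha, levels, gamma, scale = 0.05, 2, 0.98, 1.0

x = rng.standard_normal(n)                # local decision variables
y = grad(x)                               # gradient trackers, initialized at local gradients
x_hat = np.zeros(n)                       # decoder estimates of the agents' states
y_hat = np.zeros(n)                       # decoder estimates of the gradient trackers

for k in range(500):
    # Encoding: quantize the innovation (true value minus decoder estimate).
    qx = uniform_quantizer(x - x_hat, levels, scale)
    qy = uniform_quantizer(y - y_hat, levels, scale)
    # Decoding: sender and receivers update the same recursive estimates
    # from the transmitted integer symbols.
    x_hat += scale * qx
    y_hat += scale * qy
    scale *= gamma                        # shrink the quantizer range over time

    # Gradient-tracking update driven by the decoded estimates.
    g_old = grad(x)
    x = x + (W @ x_hat - x_hat) - alpha * y
    y = y + (W @ y_hat - y_hat) + grad(x) - g_old

print("distance to optimum:", np.max(np.abs(x - x_star)))
```

Because the decoder estimates `x_hat` and `y_hat` are updated identically at the sender and at every receiver, only the integer symbols `qx` and `qy` need to be transmitted each round; the shrinking scaling factor is what allows a fixed, small number of quantization levels to suffice. In the paper the scaling-factor schedule is designed so that the quantizer never saturates, which this toy setup does not verify.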