In this work, we consider the distributed optimization problem over networked computing machines. Specifically, we are interested in solving this problem using the alternating direction method of multipliers (ADMM) while accounting for edge weights. Existing works focus on star graphs and use simple heuristics for other types of graphs. The present work shows that optimal edge weight design is equivalent to finding the ADMM preconditioning matrix that yields the fastest convergence. Based on a tight convergence rate of ADMM, we show that the preconditioning matrix for general graphs can be found by minimizing the ratio of the largest to the smallest nonzero eigenvalue of the weighted graph Laplacian. Numerical experiments show that preconditioned ADMM converges to a given accuracy much faster, with fewer communication rounds, and demonstrate its robustness to topology changes in the underlying network.
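The abstract's criterion can be illustrated numerically: for a fixed topology, different edge weights change the ratio of the largest to the smallest nonzero Laplacian eigenvalue. The sketch below is illustrative only (the example graph and weights are not taken from the paper, and the actual preconditioner design involves the full ADMM analysis): it builds a weighted Laplacian and evaluates the eigenvalue ratio for uniform versus reweighted edges on a small graph.

```python
import numpy as np

def weighted_laplacian(n, edges, weights):
    """Weighted graph Laplacian: L = sum over edges of w_e (e_i - e_j)(e_i - e_j)^T."""
    L = np.zeros((n, n))
    for (i, j), w in zip(edges, weights):
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    return L

def eigenvalue_ratio(L, tol=1e-9):
    """Ratio of the largest to the smallest nonzero eigenvalue of L."""
    vals = np.linalg.eigvalsh(L)  # symmetric PSD, eigenvalues in ascending order
    nonzero = vals[vals > tol]    # drop the zero eigenvalue(s) of the Laplacian
    return nonzero[-1] / nonzero[0]

# Hypothetical example: a triangle on nodes {0, 1, 2} with a pendant node 3
# attached to node 2. Uprating the pendant edge reduces the eigenvalue ratio.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
uniform = eigenvalue_ratio(weighted_laplacian(4, edges, [1.0, 1.0, 1.0, 1.0]))
tuned = eigenvalue_ratio(weighted_laplacian(4, edges, [1.0, 1.0, 1.0, 2.0]))
print(f"uniform weights: {uniform:.3f}, reweighted pendant edge: {tuned:.3f}")
```

On this toy graph the uniform-weight ratio is exactly 4 (nonzero eigenvalues 1, 3, 4), and doubling the pendant edge weight lowers it to about 3.87, consistent with the idea that weight design can tighten the spectral quantity that governs the convergence rate.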
Original language: English (US)
Title of host publication: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
State: Published - May 2020
Event: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Barcelona, Spain (May 4, 2020 - May 8, 2020)
Publication series: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Bibliographical note: Funding Information: This work is supported by NSF grant 1901134.
Keywords:
- Decentralized optimization
- Hybrid ADMM