Abstract
In this paper, we present a novel derivation of an existing algorithm for distributed optimization termed the primal-dual method of multipliers (PDMM). In contrast to its initial derivation, monotone operator theory is used to connect PDMM with other first-order methods such as Douglas-Rachford splitting and the alternating direction method of multipliers, thus providing insight into its operation. In particular, we show how PDMM combines a lifted dual form with Peaceman-Rachford splitting to facilitate distributed optimization in undirected networks. We additionally establish sufficient conditions for primal convergence for strongly convex differentiable functions and strengthen this result for strongly convex functions with Lipschitz continuous gradients by introducing a geometric bound on primal convergence.
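To make the operator-splitting connection concrete, the sketch below illustrates generic Peaceman-Rachford splitting, the scheme the abstract links to PDMM, on a toy two-term problem. This is not the paper's PDMM algorithm or its lifted dual construction; all function names, the toy objective, and the parameters are illustrative assumptions chosen so the proximal steps have closed forms.

```python
import numpy as np

# Minimal sketch of Peaceman-Rachford splitting (PRS) for min_x f(x) + g(x),
# using the fixed-point iteration z <- (2 prox_{c g} - I)(2 prox_{c f} - I) z.
# Toy problem: f(x) = 0.5||x - a||^2, g(x) = 0.5||x - b||^2, optimum (a + b)/2.
# Names and parameters here are illustrative, not taken from the paper.

def prox_quadratic(v, center, c):
    """Prox of c * 0.5 * ||x - center||^2 evaluated at v (closed form)."""
    return (v + c * center) / (1.0 + c)

def peaceman_rachford(a, b, c=1.0, iters=50):
    """Run PRS on the toy problem; the primal iterate x approaches (a + b)/2."""
    z = np.zeros_like(a)
    for _ in range(iters):
        x = prox_quadratic(z, a, c)          # resolvent step for f
        y = prox_quadratic(2 * x - z, b, c)  # reflected step through g
        z = z + 2 * (y - x)                  # full (non-averaged) PRS update;
        # Douglas-Rachford would instead use the averaged update z = z + (y - x)
    return x

if __name__ == "__main__":
    a, b = np.array([1.0, 3.0]), np.array([5.0, -1.0])
    print(peaceman_rachford(a, b))  # approximately (a + b) / 2 = [3., 1.]
```

In PDMM the same reflected-resolvent structure is applied to a lifted dual form of the network problem, so that each resolvent evaluation decomposes into per-node updates and the reflection corresponds to an exchange of variables between neighbouring nodes.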
| Original language | English |
| --- | --- |
| Article number | 8496887 |
| Pages (from-to) | 334-347 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Signal and Information Processing over Networks |
| Volume | 5 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2019 |
Bibliographical note
Accepted author manuscript

Keywords
- distributed optimization
- monotone operator
- Primal-Dual method of multipliers (PDMM)