On Simplifying the Primal-Dual Method of Multipliers

Guoqiang Zhang, Richard Heusdens

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

8 Citations (Scopus)
17 Downloads (Pure)

Abstract

Recently, the primal-dual method of multipliers (PDMM) has been proposed for solving convex optimization problems defined over a general graph. In this paper, we consider simplifying PDMM for a subclass of such convex optimization problems, which includes the consensus problem as a special case. Through algebraic manipulation, we show that the update expressions of PDMM can be simplified significantly. We then evaluate PDMM on training a support vector machine (SVM). The experimental results indicate that PDMM converges considerably faster than the alternating direction method of multipliers (ADMM).
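To illustrate the consensus setting the abstract refers to (this is a generic sketch of consensus-form ADMM on a toy averaging problem, not the paper's simplified PDMM updates): each node i holds a local cost (1/2)(x - a_i)^2 and a local copy x_i, and all copies are driven to agree on a global variable z, which converges to the average of the a_i.

```python
import numpy as np

# Illustrative sketch only: standard consensus ADMM for the toy problem
#   min_x  sum_i (1/2) * (x - a_i)^2,
# whose minimizer is the mean of the a_i. The function name and the
# choice of rho are for illustration, not taken from the paper.

def consensus_admm(a, rho=1.0, iters=100):
    a = np.asarray(a, dtype=float)
    n = len(a)
    x = np.zeros(n)   # local primal variables, one per node
    u = np.zeros(n)   # scaled dual variables
    z = 0.0           # global consensus variable
    for _ in range(iters):
        # Local x-updates (closed form for these quadratic costs)
        x = (a + rho * (z - u)) / (1.0 + rho)
        # Global averaging step enforcing x_i ≈ z
        z = np.mean(x + u)
        # Dual ascent on the consensus constraint residuals
        u = u + x - z
    return z

print(consensus_admm([1.0, 2.0, 6.0]))  # converges to the mean, 3.0
```

The paper's point of comparison is exactly this kind of scheme: PDMM addresses the same class of problems with node-local updates but, per the reported experiments, reaches consensus in fewer iterations than ADMM.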
Original language: English
Title of host publication: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Subtitle of host publication: Proceedings
Editors: Min Dong, Thomas Fang Zheng
Place of Publication: Danvers, MA
Publisher: IEEE
Pages: 4826-4830
Number of pages: 5
ISBN (Electronic): 978-1-4799-9988-0
DOIs
Publication status: Published - 19 May 2016
Event: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Shanghai International Convention Center, Shanghai, China
Duration: 20 Mar 2016 - 25 Mar 2016

Conference

Conference: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016
Abbreviated title: ICASSP
Country: China
City: Shanghai
Period: 20/03/16 - 25/03/16

Bibliographical note

Accepted Author Manuscript

Keywords

  • distributed optimization
  • PDMM
  • ADMM
  • SVM
