Privacy-Preserving Distributed Optimization via Subspace Perturbation: A General Framework

Qiongxiu Li, R. Heusdens, M. Graesboll Christensen

Research output: Contribution to journal › Article › Scientific › peer-review

20 Citations (Scopus)

Abstract

As the modern world becomes increasingly digitized and interconnected, distributed signal processing has proven effective at processing its large volumes of data. However, a major challenge limiting the broad use of distributed signal processing techniques is privacy when handling sensitive data. To address this issue, we propose a novel yet general subspace perturbation method for privacy-preserving distributed optimization, which allows each node to obtain the desired solution while protecting its private data. In particular, we show that the dual variable introduced in each distributed optimizer does not converge in a certain subspace determined by the graph topology. Additionally, the optimization variable is guaranteed to converge to the desired solution, because it is orthogonal to this non-convergent subspace. We therefore propose to insert noise into the non-convergent subspace through the dual variable, so that the private data are protected while the accuracy of the desired solution is completely unaffected. Moreover, the proposed method is shown to be secure under two widely used adversary models: passive and eavesdropping. Furthermore, we consider several distributed optimizers, such as ADMM and PDMM, to demonstrate the general applicability of the proposed method. Finally, we evaluate the performance on a set of applications. Numerical tests indicate that the proposed method outperforms existing methods in terms of estimation accuracy, privacy level, communication cost and convergence rate.
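The key idea, that noise injected through the dual variable lies in a subspace which never influences the primal update, can be illustrated with a small distributed-averaging sketch. The update equations below follow the standard PDMM consensus form (quadratic local costs, edge constraints x_i = x_j, averaged dual update with theta = 1/2); the ring graph, penalty parameter c, iteration count, and noise scale are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ring graph with n nodes; node i holds private data s[i], and all nodes
# should agree on the average of s (distributed averaging).
n = 5
s = rng.normal(size=n)
neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def A(i, j):
    # Edge-constraint signs: A_ij * x_i + A_ji * x_j = 0 encodes x_i = x_j.
    return 1.0 if i < j else -1.0

c = 0.5                                   # penalty parameter (illustrative)
d = {i: len(neighbors[i]) for i in range(n)}

# Subspace perturbation: initialise each dual variable z_{i|j} with large
# random noise. Its component in the non-convergent subspace masks the
# private data in the exchanged messages, yet never enters the x-update.
z = {(i, j): 1e3 * rng.normal() for i in range(n) for j in neighbors[i]}

x = np.zeros(n)
for _ in range(2000):
    # Primal update for the quadratic local cost f_i(x) = (x - s_i)^2 / 2.
    for i in range(n):
        x[i] = (s[i] - sum(A(i, j) * z[(i, j)] for j in neighbors[i])) \
               / (1 + c * d[i])
    # Node i sends y_{i|j} to neighbour j; averaged dual update (theta = 1/2).
    y = {(j, i): z[(i, j)] + 2 * c * A(i, j) * x[i]
         for i in range(n) for j in neighbors[i]}
    z = {k: 0.5 * z[k] + 0.5 * y[k] for k in z}

print("estimate:", x.round(6), " true average:", round(s.mean(), 6))
```

Re-running with the dual variables initialised to zero gives the same limit, which is the point of the method: the injected noise perturbs what neighbours observe, but not the solution the primal iterates converge to.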
Original language: English
Pages (from-to): 5983-5996
Number of pages: 14
Journal: IEEE Trans. Signal Processing
Volume: 68
DOIs
Publication status: Published - 2020
