Robust Censoring Using Metropolis-Hastings Sampling

G Kail, SP Chepuri, G Leus

Research output: Contribution to journal › Article › Scientific › peer-review

2 Citations (Scopus)

Abstract

The tasks of online data reduction and outlier rejection are both of high interest when large amounts of data are to be processed for inference. Rather than performing these tasks separately, we propose a joint approach, i.e., robust censoring. We formulate the problem as a non-convex optimization problem based on the data model for outlier-free data, without requiring prior model assumptions about the outlier perturbations. Moreover, our approach is general in that it is not restricted to any specific data model and does not rely on linearity, uncorrelated measurements, or additive Gaussian noise. For a given desired compression rate, the choice of the reduced dataset is optimal in the sense that it jointly maximizes the likelihood together with the inferred model parameters. An extension of the problem formulation allows for taking the average estimation performance into account in a hybrid optimality criterion. To solve the problem of robust censoring, we propose a Metropolis-Hastings sampler method that operates on small subsets of the data, thus limiting the computational complexity. As a practical example, the problem is specialized to the application of robust censoring for target localization. Simulation results confirm the superiority of the proposed method compared to other approaches.
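The core idea of the abstract, selecting a reduced subset of measurements that jointly maximizes the likelihood while implicitly discarding outliers, can be illustrated with a toy Metropolis-Hastings sampler over fixed-size subsets. The sketch below is not the paper's algorithm: it assumes a simple linear-Gaussian model, a symmetric one-swap proposal, and unit "temperature", all illustrative choices. The paper's formulation is model-agnostic and does not rely on linearity or Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian data with a few gross outliers (illustrative only).
N, K = 30, 20                       # total measurements, censored subset size
theta_true = 2.0
a = rng.normal(size=N)              # known regressors
y = a * theta_true + 0.1 * rng.normal(size=N)
y[:5] += 5.0                        # inject 5 outliers at indices 0..4

def neg_loglik(idx):
    """Profile cost: fit theta on the subset, return its residual sum of squares
    (proportional to the negative log-likelihood under Gaussian noise)."""
    A, Y = a[idx], y[idx]
    theta = (A @ Y) / (A @ A)       # ML estimate of the model parameter
    r = Y - A * theta
    return 0.5 * np.sum(r ** 2)

# Metropolis-Hastings over size-K subsets: propose swapping one kept index
# with one discarded index; accept with the usual MH ratio (symmetric proposal).
idx = rng.choice(N, size=K, replace=False)
cost = neg_loglik(idx)
for _ in range(2000):
    out = np.setdiff1d(np.arange(N), idx)
    prop = idx.copy()
    prop[rng.integers(K)] = out[rng.integers(N - K)]
    c = neg_loglik(prop)
    if rng.random() < np.exp(min(0.0, cost - c)):
        idx, cost = prop, c

print(sorted(int(i) for i in idx))  # the outlier indices 0..4 tend to be censored out
```

Because a swap that removes an outlier drops the cost sharply (and is thus always accepted) while the reverse swap is almost surely rejected, the chain concentrates on outlier-free subsets. Each proposal only refits the model on a small subset, which mirrors the complexity argument in the abstract.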
Original language: English
Pages (from-to): 270-283
Number of pages: 14
Journal: IEEE Journal of Selected Topics in Signal Processing
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 4 Dec 2015

Keywords

  • Big data
  • censoring
  • Markov chain Monte Carlo method
  • Metropolis-Hastings sampler
  • outlier rejection
  • robustness
  • sparse sensing

