A non-parametric Bayesian approach to decompounding from high frequency data

Shota Gugushvili, Frank van der Meulen, Peter Spreij

Research output: Contribution to journal › Article › Scientific

Abstract

Given a sample from a discretely observed compound Poisson process, we consider non-parametric estimation of the density f0 of its jump sizes, as well as of its intensity λ0. We take a Bayesian approach to the problem and specify the prior on f0 as the Dirichlet location mixture of normal densities. An independent prior for λ0 is assumed to be compactly supported and to possess a positive density with respect to the Lebesgue measure. We show that under suitable assumptions the posterior contracts around the pair (λ0, f0) at essentially (up to a logarithmic factor) the √(nΔ)-rate, where n is the number of observations and Δ is the mesh size at which the process is sampled. The emphasis is on high frequency data, Δ→0, but the obtained results are also valid for fixed Δ. In either case we assume that nΔ→∞. Our main result implies existence of Bayesian point estimates converging (in the frequentist sense, in probability) to (λ0, f0) at the same rate. We also discuss a practical implementation of our approach. The computational problem is dealt with by inclusion of auxiliary variables and we develop a Markov chain Monte Carlo algorithm that samples from the joint distribution of the unknown parameters in the mixture density and the introduced auxiliary variables. Numerical examples illustrate the feasibility of this approach.
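To make the observation scheme concrete, the following minimal Python sketch (not the paper's algorithm) simulates n increments of a compound Poisson process observed on a grid with mesh Δ and recovers the intensity with a naive plug-in estimate that exhibits the √(nΔ) scaling; the function names, the choice of f0 as a single normal density, and all parameter values are hypothetical.

import numpy as np

def simulate_cpp_increments(n, delta, lam, jump_sampler, rng):
    # Each increment over an interval of length delta is a sum of a
    # Poisson(lam * delta) number of i.i.d. jumps drawn from the jump density.
    counts = rng.poisson(lam * delta, size=n)
    return np.array([jump_sampler(rng, k).sum() if k > 0 else 0.0 for k in counts])

rng = np.random.default_rng(0)
lam0 = 1.0                                        # true intensity λ0 (hypothetical value)
f0 = lambda rng, k: rng.normal(2.0, 0.5, size=k)  # true jump density f0 (here a single normal)

n, delta = 5000, 0.1
x = simulate_cpp_increments(n, delta, lam0, f0, rng)

# Naive frequentist check, not the Bayesian estimator of the paper: for small delta,
# P(at least one jump in an interval) = 1 - exp(-λ0 Δ) ≈ λ0 Δ, so the fraction of
# nonzero increments divided by delta is a crude estimate of λ0.
lam_hat = np.mean(x != 0.0) / delta
print(lam_hat)

In the paper's Bayesian treatment the unobserved jump counts and jump sizes behind each increment play the role of the auxiliary variables over which the MCMC sampler alternates; the sketch above only generates the data that such a sampler would take as input.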
Original language: English
Pages (from-to): 1-27
Number of pages: 27
Journal: Statistical Inference for Stochastic Processes
Publication status: Published - 2016

Keywords

  • Compound Poisson process
  • Non-parametric Bayesian estimation
  • Posterior contraction rate
  • High frequency observations
