A Process Pattern Model for Tackling and Improving Big Data Quality

Agung Wahyudi*, George Kuk, Marijn Janssen

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

44 Citations (Scopus)
84 Downloads (Pure)

Abstract

Data seldom create value by themselves. They need to be linked and combined across multiple sources, which often vary in data quality. Improving data quality is thus a recurring challenge. In this paper, we use a case study of a large telecom company to develop a generic process pattern model for improving data quality. A process pattern is defined as a proven series of activities aimed at improving data quality given a certain context, a particular objective, and a specific set of initial conditions. Four different patterns are derived to deal with variations in the data quality of datasets. Instead of having to devise a quality-improvement approach anew for each situation, data users can draw on these generic patterns as a reference model for improving big data quality.
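As a rough illustration of this definition (not the paper's own implementation), the sketch below models a process pattern in Python as a series of activities selected by matching context and initial conditions. All names here (ProcessPattern, applies_to, run) are hypothetical and chosen for exposition only.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical sketch: illustrates the definition of a process pattern
# quoted in the abstract, not the authors' actual model or code.

@dataclass
class ProcessPattern:
    """A proven series of activities for improving data quality,
    applicable given a context, an objective, and initial conditions."""
    context: str                   # e.g. "data combined from multiple sources"
    objective: str                 # e.g. "improve completeness"
    initial_conditions: List[str]  # preconditions on the input dataset
    activities: List[Callable[[dict], dict]] = field(default_factory=list)

    def applies_to(self, situation_context: str, conditions: List[str]) -> bool:
        # A pattern is a candidate when its context matches and all of
        # its initial conditions hold for the dataset at hand.
        return (self.context == situation_context
                and all(c in conditions for c in self.initial_conditions))

    def run(self, dataset: dict) -> dict:
        # Apply the pattern's activities in sequence to the dataset.
        for activity in self.activities:
            dataset = activity(dataset)
        return dataset
```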

Original language: English
Pages (from-to): 1-13
Number of pages: 13
Journal: Information Systems Frontiers: a journal of research and innovation
DOIs
Publication status: Published - 2018

Keywords

  • Big data
  • Data processing
  • Data quality
  • Information quality
  • Process patterns
  • Reference model
  • Telecom

