A systematic study of unsupervised domain adaptation for robust human-activity recognition

Youngjae Chang, Akhil Mathur, Anton Isopoussu, Junehwa Song, Fahim Kawsar

Research output: Contribution to journal › Article › Scientific › peer-review

2 Citations (Scopus)

Abstract

Wearable sensors are increasingly becoming the primary interface for monitoring human activities. However, in order to scale human activity recognition (HAR) using wearable sensors to millions of users and devices, it is imperative that HAR computational models are robust against real-world heterogeneity in inertial sensor data. In this paper, we study the problem of wearing diversity, which pertains to the placement of the wearable sensor on the human body, and demonstrate that even state-of-the-art deep learning models are not robust against these factors. The core contribution of the paper lies in presenting a first-of-its-kind in-depth study of unsupervised domain adaptation (UDA) algorithms in the context of wearing diversity: we develop and evaluate three adaptation techniques on four HAR datasets to assess their relative performance in addressing the issue of wearing diversity. More importantly, we also conduct a careful analysis to identify the downsides of each UDA algorithm and uncover several implicit data-related assumptions without which these algorithms suffer a major degradation in accuracy. Taken together, our experimental findings caution against using UDA as a silver bullet for adapting HAR models to new domains. They serve as practical guidelines for HAR practitioners and pave the way for future research on domain adaptation in HAR.
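The abstract does not name the three UDA techniques studied. As an illustration only, one common family of UDA methods aligns the feature distributions of the source domain (e.g., one sensor placement) and the target domain (another placement) by minimizing a discrepancy measure such as the Maximum Mean Discrepancy (MMD). The sketch below computes a biased RBF-kernel MMD estimate between two synthetic feature sets; the feature dimensions, sample counts, and kernel bandwidth are assumptions for the toy example, not values from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Pairwise squared Euclidean distances, then RBF kernel values.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def mmd2(Xs, Xt, gamma=None):
    # Biased estimate of squared Maximum Mean Discrepancy between
    # source features Xs and target features Xt. A larger value
    # indicates a bigger domain gap between the two feature sets.
    if gamma is None:
        gamma = 1.0 / (2.0 * Xs.shape[1])  # simple bandwidth heuristic
    k_ss = rbf_kernel(Xs, Xs, gamma).mean()
    k_tt = rbf_kernel(Xt, Xt, gamma).mean()
    k_st = rbf_kernel(Xs, Xt, gamma).mean()
    return k_ss + k_tt - 2.0 * k_st

# Toy check: features drawn from the same distribution give a small MMD,
# while a shifted distribution (mimicking a different sensor placement)
# gives a clearly larger one.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 8))        # hypothetical wrist features
tgt_same = rng.normal(0.0, 1.0, size=(200, 8))   # same distribution
tgt_shift = rng.normal(2.0, 1.0, size=(200, 8))  # shifted, e.g. ankle-worn
print(mmd2(src, tgt_same) < mmd2(src, tgt_shift))  # True
```

In adversarial or MMD-based UDA training, a term like `mmd2` would be added to the classifier's loss so the feature extractor learns placement-invariant representations; the paper's actual techniques may differ from this sketch.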

Original language: English
Article number: 3380985
Number of pages: 30
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume: 4
Issue number: 1
DOIs
Publication status: Published - 2020

Keywords

  • Human Activity Recognition
  • Unsupervised Domain Adaptation
  • Wearing Diversity
