Exploring Audio and Kinetic Sensing on Earable Devices

Chulhong Min, Akhil Mathur, Fahim Kawsar

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › Peer-reviewed

39 Citations (Scopus)

Abstract

In this paper, we explore audio and kinetic sensing on earable devices with a commercial off-the-shelf form factor. For the study, we prototyped earbud devices with a 6-axis inertial measurement unit and a microphone. We systematically investigate the differential characteristics of the audio and inertial signals to assess their feasibility for human activity recognition. Our results demonstrate that earable devices offer a superior signal-to-noise ratio under the influence of motion artefacts and are less susceptible to environmental acoustic noise. We then present a set of activity primitives and corresponding signal processing pipelines that showcase the capability of earbud devices to convert accelerometer, gyroscope, and audio signals into the targeted human activities, with a mean accuracy reaching up to 88% under varying environmental conditions.
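The pipelines summarised in the abstract segment sensor streams into windows, extract per-window features, and map them to activity primitives. A minimal sketch of such a windowed feature pipeline on synthetic accelerometer data follows; the window size, step, and energy threshold are illustrative assumptions, not the parameters used in the paper:

```python
import numpy as np

def window(signal, size, step):
    """Split a 1-D signal into overlapping fixed-size windows."""
    return np.array([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

def features(wins):
    """Per-window mean, standard deviation, and mean energy."""
    return np.column_stack([wins.mean(axis=1),
                            wins.std(axis=1),
                            (wins ** 2).mean(axis=1)])

# Synthetic accelerometer magnitude: 2 s of rest, then a 2 Hz motion burst
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 400)                        # 4 s sampled at 100 Hz
sig = 0.05 * rng.standard_normal(400)             # sensor noise floor
sig[200:] += np.sin(2 * np.pi * 2.0 * t[200:])    # motion artefact

wins = window(sig, size=100, step=50)             # 1 s windows, 50% overlap
feats = features(wins)
active = feats[:, 2] > 0.05                       # energy flags motion windows
print(active)
```

A real recogniser would replace the energy threshold with a classifier trained on labelled activity primitives, and fuse the inertial features with audio features from the microphone channel.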

Original language: English
Title of host publication: WearSys 2018
Subtitle of host publication: Proceedings of the 4th ACM Workshop on Wearable Systems and Applications
Place of Publication: New York, NY
Publisher: Association for Computing Machinery (ACM)
Pages: 5-10
Number of pages: 6
ISBN (Print): 978-1-4503-5842-2
DOIs
Publication status: Published - 2018
Event: 4th ACM Workshop on Wearable Systems and Applications (WearSys 2018) - Munich, Germany
Duration: 10 Jun 2018 - 10 Jun 2018
Conference number: 4

Conference

Conference: 4th ACM Workshop on Wearable Systems and Applications, WearSys 2018
Abbreviated title: WearSys 2018
Country/Territory: Germany
City: Munich
Period: 10/06/18 - 10/06/18

Keywords

  • Audio sensing
  • Earable
  • Earbud
  • Kinetic sensing
