Supporting Real-Time Contextual Inquiry Through Sensor Data

Katerina Gorkovenko, Dan Burnett, D.S. Murray-Rust, J Thorp, Daniel Richards

Research output: Conference contribution in book/conference proceedings (scientific, peer-reviewed)

Abstract

A key challenge in carrying out product design research is obtaining rich contextual information about use in the wild. We present a method that algorithmically mediates between participants, researchers, and objects in order to enable real-time collaborative sensemaking. It facilitates contextual inquiry, revealing behaviours and motivations that frame product use in the wild. In particular, we are interested in developing a practice of use-driven design, where products become research tools that generate design insights grounded in user experiences. The value of this method was explored through the deployment of a collection of Bluetooth speakers that capture and stream live data about their movement and operation to remote but co-present researchers. Researchers monitored a visualisation of the real-time data to build up a picture of how the speakers were being used, responding to moments of activity within the data, initiating text conversations, and prompting participants to capture photos and video. Based on the findings of this exploratory study, we discuss the value of this method, how it compares to contemporary research practices, and the potential of machine learning to scale it up for use within industrial contexts. As greater agency is given to both objects and algorithms, we explore ways to empower ethnographers and participants to actively collaborate within remote real-time research.
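The abstract describes a pipeline in which instrumented speakers stream movement and operation data to researchers, who watch for moments of activity and then reach out to participants. The sketch below is purely illustrative and not taken from the paper: the event fields, the activity threshold, and the simulated feed are all assumptions, standing in for the live Bluetooth data and researcher alerting described above.

```python
import json
import random
import time
from dataclasses import dataclass, asdict


@dataclass
class SpeakerEvent:
    """Hypothetical sensor reading emitted by an instrumented speaker."""
    device_id: str
    timestamp: float
    accel_magnitude: float   # movement, in g
    volume_level: int        # operation: 0-100
    is_playing: bool


ACTIVITY_THRESHOLD = 0.3  # assumed movement threshold, in g


def simulated_stream(device_id: str, n_events: int = 20):
    """Stand-in for the live data feed a deployed speaker would send."""
    for _ in range(n_events):
        yield SpeakerEvent(
            device_id=device_id,
            timestamp=time.time(),
            accel_magnitude=abs(random.gauss(0.0, 0.25)),
            volume_level=random.randint(0, 100),
            is_playing=random.random() > 0.3,
        )
        time.sleep(0.05)


def monitor(stream):
    """Flag moments of activity so a researcher could open a conversation."""
    for event in stream:
        record = json.dumps(asdict(event))
        if event.accel_magnitude > ACTIVITY_THRESHOLD:
            # In the study, this is the point at which a researcher might
            # initiate a text conversation or prompt for photos/video.
            print(f"ACTIVITY  {record}")
        else:
            print(f"          {record}")


if __name__ == "__main__":
    monitor(simulated_stream("speaker-01"))
```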
Original language: English
Title of host publication: EPIC 2019: Ethnographic Praxis in Industry Conference Proceedings
Pages: 554-581
Number of pages: 28
DOIs
Publication status: Published - 2019

