Driver response times to auditory, visual, and tactile take-over requests: A simulator study with 101 participants

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

68 Citations (Scopus)
180 Downloads (Pure)

Abstract

Conditionally automated driving systems may soon be available on the market. Even though these systems exempt drivers from the driving task for extended periods of time, drivers are expected to take back control when the automation issues a so-called take-over request. This study investigated the interaction between take-over request modality and type of non-driving task with respect to the driver's reaction time. It was hypothesized that reaction times would be higher when the non-driving task and the take-over request use the same modality. For example, auditory take-over requests were expected to be relatively ineffective in situations in which the driver is making a phone call. 101 participants, divided into three groups, performed one of three non-driving tasks, namely reading (i.e., a visual task), calling (an auditory task), or watching a video (a visual/auditory task). Results showed that auditory and tactile take-over requests yielded overall faster reactions than visual take-over requests. The expected interaction between take-over modality and the dominant modality of the non-driving task was not found. As for self-reported usefulness, auditory and tactile take-over requests yielded higher scores than visual ones. In conclusion, it seems that auditory and tactile stimuli are equally effective as take-over requests, regardless of the non-driving task. Further study into the effects of realistic non-driving tasks is needed to identify which non-driving tasks are detrimental to safety in automated driving.
Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (SMC 2017)
Editors: Anup Basu, Witold Pedrycz, Xenophon Zabuli
Place of Publication: Piscataway, NJ, USA
Publisher: IEEE
Pages: 1505-1510
ISBN (Print): 978-1-5386-1645-1
DOIs
Publication status: Published - 2017
Event: SMC 2017: IEEE International Conference on Systems, Man, and Cybernetics - Banff, Canada
Duration: 5 Oct 2017 - 8 Oct 2017

Conference

Conference: SMC 2017: IEEE International Conference on Systems, Man, and Cybernetics
Country/Territory: Canada
City: Banff
Period: 5/10/17 - 8/10/17

Bibliographical note

Green Open Access added to TU Delft Institutional Repository ‘You share, we take care!’ – Taverne project https://www.openaccess.nl/en/you-share-we-take-care

Except where indicated otherwise in the copyright section, the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.

Keywords

  • Visualization
  • Vehicles
  • Automation
  • Light emitting diodes
  • Wheels
  • Roads

