Abstract
Driving is a safety-critical task that predominantly relies on vision. However, visual information from the environment is sometimes degraded or absent. In other cases, visual information is available, but the driver fails to use it due to distraction or impairment. Providing drivers with real-time auditory feedback about the state of the vehicle in relation to the environment may be an appropriate means of support when visual information is compromised. In this study, we explored whether driving can be performed solely by means of artificial auditory feedback. We focused on lane keeping, a task that is vital for safe driving. Three auditory parameter sets were tested: (1) predictor time, where the volume of a continuous tone was a linear function of the predicted lateral error from the lane centre 0 s, 1 s, 2 s, or 3 s into the future; (2) feedback mode (volume feedback vs. beep-frequency feedback) and mapping (linear vs. exponential relationship between predicted error and volume/beep frequency); and (3) corner support, in which, in addition to volume feedback, a beep was offered upon entering/leaving a corner, or alternatively when crossing the lane centre while driving in a corner. A dead-zone was used, whereby the volume/beep-frequency feedback was provided only when the vehicle deviated more than 0.5 m from the centre of the lane. An experiment was conducted in which participants (N = 2) steered along a track with sharp 90-degree corners in a simulator with the visual projection shut down. Results showed that without predictor feedback (i.e., 0 s prediction), participants were more likely to depart the road than with predictor feedback. Moreover, volume feedback resulted in fewer road departures than beep-frequency feedback. The results of this study may be used in the design of in-vehicle auditory displays. Specifically, we recommend that feedback be based on anticipated error rather than current error.
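The predictor and feedback mapping described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the constant-velocity predictor, the saturation distance, and the exponential steepness are assumptions introduced here; only the 0.5 m dead-zone and the linear/exponential mapping distinction come from the abstract.

```python
import math


def predicted_lateral_error(error, lateral_velocity, predictor_time):
    """Predicted deviation from the lane centre (m) at predictor_time seconds
    ahead, assuming constant lateral velocity (an illustrative assumption)."""
    return error + lateral_velocity * predictor_time


def feedback_volume(predicted_error, dead_zone=0.5, max_error=2.0,
                    mapping="linear", steepness=3.0):
    """Map a predicted lateral error (m) to a normalised volume in [0, 1].

    No feedback is given inside the 0.5 m dead-zone around the lane centre.
    The 2.0 m saturation point and the exponential steepness are illustrative
    values, not parameters reported in the paper.
    """
    excess = abs(predicted_error) - dead_zone
    if excess <= 0.0:
        return 0.0  # inside dead-zone: silence
    ratio = min(excess / (max_error - dead_zone), 1.0)
    if mapping == "linear":
        return ratio
    # Exponential mapping, rescaled so that 0 maps to 0 and 1 maps to 1
    return (math.exp(steepness * ratio) - 1.0) / (math.exp(steepness) - 1.0)
```

With a 1 s predictor, a vehicle 0.2 m off-centre drifting outward at 0.5 m/s would be treated as a 0.7 m predicted error, so feedback starts before the current error leaves the dead-zone.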
Original language | English |
---|---|
Pages (from-to) | 525-530 |
Journal | IFAC-PapersOnLine |
Volume | 49 |
Issue number | 19 |
DOIs | |
Publication status | Published - 2016 |
Event | 13th IFAC Symposium on Analysis, Design, and Evaluation of Human-Machine Systems - Kyoto, Japan Duration: 30 Aug 2016 → 2 Sep 2016 |
Keywords
- auditory display
- driver support
- driving simulator
- human-machine interface
- road safety
Datasets
- Supplementary material for Blind Driving papers
  Bazilinskyy, P. (Creator) & de Winter, J. C. F. (Creator), TU Delft - 4TU.ResearchData, 2019
  DOI: 10.4121/UUID:6C02218F-CD2C-4EB2-8C93-1EEAC44690ED
  Dataset/Software: Dataset
- Supplementary material for Blind Driving papers
  Bazilinskyy, P. (Creator) & de Winter, J. C. F. (Creator), TU Delft - 4TU.ResearchData, 3 May 2022
  DOI: 10.4121/12702911
  Dataset/Software: Dataset