In a crowdsourced experiment, the effects of the approaching vehicle's distance and type, traffic density, and visual clutter on pedestrians' attention distribution were explored. A total of 966 participants viewed 107 images of diverse traffic scenes for durations between 100 and 4000 ms. Participants' eye-gaze data were collected using the TurkEyes method, which involves briefly showing a codechart after each image and asking participants to type the code they saw last. The results indicate that automated vehicles were glanced at more often than manual vehicles. The findings suggest that measuring eye gaze without an eye tracker is a promising approach.
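The codechart idea behind TurkEyes can be illustrated with a minimal sketch: a grid of short codes covers the display, and the code a participant types is mapped back to the screen coordinate of its cell, approximating the last gaze position. The function names, grid size, and canvas resolution below are illustrative assumptions, not the study's actual implementation.

```python
import random
import string

def make_codechart(rows, cols, seed=0, width=1280, height=720):
    """Generate a grid of unique three-letter codes and map each code
    to the pixel center of its cell (canvas size is an assumption)."""
    rng = random.Random(seed)
    codes = set()
    while len(codes) < rows * cols:
        codes.add("".join(rng.choices(string.ascii_uppercase, k=3)))
    cell_w, cell_h = width / cols, height / rows
    positions = {}
    for i, code in enumerate(sorted(codes)):
        r, c = divmod(i, cols)
        positions[code] = ((c + 0.5) * cell_w, (r + 0.5) * cell_h)
    return positions

def decode_gaze(typed_code, positions):
    """Return the estimated gaze coordinate for the typed code,
    or None if the code does not appear on the chart."""
    return positions.get(typed_code.strip().upper())
```

Because participants only type a code, no eye-tracking hardware is needed; gaze resolution is limited by the grid's cell size.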
|Title of host publication||Advances in Human Aspects of Transportation|
|Subtitle of host publication||Proceedings of the AHFE 2021 Virtual Conference on Human Aspects of Transportation, July 25-29, 2021, USA|
|Place of Publication||Cham, Switzerland|
|Publication status||Published - 2021|
|Event||AHFE 2021: International Conference on Applied Human Factors and Ergonomics (Virtual)|
Duration: 25 Jul 2021 → 29 Jul 2021
|Name||Lecture Notes in Networks and Systems|
Bibliographical note: Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care
Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.
- Eye gaze
- Automated driving