Trust is an important factor in building societal acceptance of autonomous vehicles (AVs), but the complex nature of trust makes it challenging to design for an appropriate level of trust, which can lead to instances of mistrust and/or distrust between users and AVs. Designing for calibrated trust is one way to address this challenge. Existing research on designing for calibrated trust focuses on the human-machine interaction (HMI), whereas the literature suggests that trust formation begins well before the first interaction between a user and an AV. The goal of our research is to broaden the scope of calibrated trust by exploring the pre-use phase and to understand the challenges faced in the calibration of trust. In our study, 16 mobility experts were interviewed and a thematic analysis of the interviews was conducted. The analysis revealed a lack of clear communication between stakeholders, a solutionism approach to designing, and a lack of transparency in design as the most prominent challenges. Building on these insights, we briefly introduce the Calibrated Trust Toolkit as our design solution, and conclude by proposing a sweet spot for achieving calibration of trust between users and autonomous vehicles.
Number of pages: 10
Journal: Proceedings of the Design Society
Publication status: Published - 2021
Event: 23rd International Conference on Engineering Design, ICED 2021 - Gothenburg, Sweden
Duration: 16 Aug 2021 → 20 Aug 2021
Bibliographical note: Publisher Copyright © ICED 2021. All rights reserved.
- AI Solutionism
- Artificial intelligence
- Calibrated Trust
- Design to X