Trust with increasing and decreasing reliability
- PMID: 38445652
- PMCID: PMC11487872
- DOI: 10.1177/00187208241228636
Abstract
Objective: The primary purpose was to determine how trust changes over time when automation reliability increases or decreases. A secondary purpose was to determine how task-specific self-confidence is associated with trust and reliability level.
Background: Both overtrust and undertrust can be detrimental to system performance; therefore, the temporal dynamics of trust with changing reliability level need to be explored.
Method: Two experiments used a dominant-color identification task in which automation recommended a response to users, with the reliability of the recommendation changing over 300 trials. In Experiment 1, two groups of participants interacted with the system: one group started with a 50% reliable system whose reliability increased to 100%, while the other group's system decreased from 100% to 50% reliability. Experiment 2 added a group for which automation reliability increased from 70% to 100%.
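To make the reliability manipulation concrete, the sketch below simulates how a trial sequence of this kind could be generated: on each trial the automation's recommendation is correct with probability equal to the current reliability level. The linear ramp and the function names (reliability_schedule, run_condition) are illustrative assumptions; the abstract does not specify how reliability changed across the 300 trials.

```python
import random

N_TRIALS = 300

def reliability_schedule(start, end, n_trials=N_TRIALS):
    """Reliability per trial, linearly interpolated from start to end (an assumed schedule)."""
    return [start + (end - start) * t / (n_trials - 1) for t in range(n_trials)]

def run_condition(schedule, seed=0):
    """Simulate whether the automation's recommendation is correct on each trial."""
    rng = random.Random(seed)
    return [rng.random() < p for p in schedule]

# The three groups described across Experiments 1 and 2:
increasing_50 = run_condition(reliability_schedule(0.50, 1.00))  # 50% -> 100%
decreasing    = run_condition(reliability_schedule(1.00, 0.50))  # 100% -> 50%
increasing_70 = run_condition(reliability_schedule(0.70, 1.00))  # 70% -> 100%

print(f"Observed accuracy, 50%->100% group: {sum(increasing_50) / N_TRIALS:.2f}")
```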
Results: Trust was initially high in the decreasing-reliability group and then declined as reliability decreased; however, trust also declined in the 50% increasing-reliability group. Furthermore, when user self-confidence increased, automation reliability had a greater influence on trust. In Experiment 2, the 70% increasing-reliability group showed increased trust in the system.
Conclusion: Trust does not always track the reliability of automated systems; in particular, it is difficult for trust to recover once the user has interacted with a low-reliability system.
Applications: This study provides initial evidence on the dynamics of trust in automation that improves over time, suggesting that users should begin interacting with automation only once it is sufficiently reliable.
Keywords: automation; decision making; human-automation interaction; levels of automation; trust in automation.
Conflict of interest statement
Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.