Do I trust a machine? Differences in user trust based on system performance
- Publisher: Springer
- Publication Type: Chapter
- Citation: Human and Machine Learning, 2018, pp. 245 - 264
- Issue Date: 2018
Closed Access
Filename | Description | Size
---|---|---
Ch12_Pages from 2018_Book_HumanAndMachineLearning-2.pdf | Published version | 664.23 kB
This item is closed access and not available.
Trust plays an important role in various user-facing systems and applications. It is particularly important in the context of decision support systems, where the system's output serves as one of the inputs to the user's decision-making process. In this chapter, we study the dynamics of explicit and implicit user trust in a simulated automated quality monitoring system as a function of the system's accuracy. We establish that users correctly perceive the accuracy of the system and adjust their trust accordingly. The results also show notable differences between two groups of users and indicate a possible threshold in the acceptance of the system. These findings can be leveraged by designers of practical systems to sustain the desired level of user trust.