The rapid development of automation has led to machines increasingly taking over tasks previously reserved for human operators, especially in high-risk settings and in moral decision making. To benefit fully from the advantages of automation, these systems must be integrated into work environments and into society as a whole. Successful integration requires understanding how users come to accept technology by learning to trust in its reliability. It is thus essential to examine the factors that influence the integration, acceptance, and use of automated technologies. Accordingly, this study experimentally manipulated risk and context factors to investigate the conditions under which human operators were willing to relinquish control and delegate tasks to automated agents. In a decision task, participants (N = 43, 27 female) were placed in different situations in which they could choose to delegate a task to an automated agent or to execute it manually. The results of our experiment indicated that both context and risk significantly influenced people’s decisions. While, unsurprisingly, the reliability of an automated agent strongly influenced trust in automation, the different types of decision support systems did not appear to affect participant compliance. Our findings suggest that contextual factors should be considered when designing automated systems that navigate moral norms and individual preferences.
Liehner, Gian Luca, Philipp Brauner, Anne Kathrin Schaar, and Martina Ziefle. 2021. “Delegation of Moral Tasks to Automated Agents: The Impact of Risk and Context on Trusting a Machine to Perform a Task.” IEEE Transactions on Technology and Society, 1–14. https://doi.org/10.1109/TTS.2021.3118355.