Promises and trust in human-robot interaction

Lorenzo Cominelli et al.

Sci Rep. 2021 May 6;11(1):9687. doi: 10.1038/s41598-021-88622-9.

Abstract

Understanding human trust in machine partners has become imperative due to the widespread use of intelligent machines in a variety of applications and contexts. The aim of this paper is to investigate whether human beings trust a social robot, i.e. a human-like robot that embodies emotional states, empathy, and non-verbal communication, differently than other types of agents. To do so, we adapt the well-known economic trust game proposed by Charness and Dufwenberg (2006) to assess whether receiving a promise from a robot increases human trust in it. We find that receiving a promise from the robot increases human trust in it, but only for individuals who perceive the robot as very similar to a human being. Importantly, we observe a similar pattern in choices when we replace the humanoid counterpart with a real human, but not when it is replaced by a computer-box. Additionally, we investigate participants' psychophysiological reactions in terms of cardiovascular and electrodermal activity. Our results highlight increased psychophysiological arousal when the game is played with the social robot compared to the computer-box. Taken together, these results strongly support the development of technologies enhancing the humanity of robots.
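For readers unfamiliar with the underlying game, the sketch below illustrates its sequential structure: player A chooses whether to trust (play "In"), after which player B chooses whether to honor that trust ("Roll") at a small cost to themselves. The payoff values are an assumption drawn from Charness and Dufwenberg's (2006) baseline (5,5) design, not from this article's text.

    # Minimal sketch of the Charness-Dufwenberg (2006) trust game structure.
    # Payoffs follow the original paper's baseline (5,5) design (an assumption;
    # the abstract does not state the values used in this study).
    import random

    def trust_game(a_plays_in: bool, b_rolls: bool, rng=random.random):
        """Return (payoff_A, payoff_B) for one play of the game."""
        if not a_plays_in:
            return (5, 5)        # A opts out: safe outside option for both
        if not b_rolls:
            return (0, 14)       # B breaks trust and keeps the surplus
        # B rolls a die: the joint project succeeds with probability 5/6
        return (12, 10) if rng() < 5 / 6 else (0, 10)

    # Example: A trusts, and B honors a promise to roll
    print(trust_game(a_plays_in=True, b_rolls=True))

In the study's 'promise' conditions, player B (the robot, human, or computer-box) sends a promise to roll before A decides; the question is whether that promise shifts A toward playing 'In'.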


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1. Three types of player-B.
Figure 2. The game.
Figure 3. Marginal effect of Sympamp High on the probability of playing ’In’.
Figure 4. Emotional state of the robot.
Figure 5. Decision Rule of the robot.
Figure 6. The trust game in the ‘promise’ conditions with lying aversion.

