Differences in Social Expectations About Robot Signals and Human Signals
- PMID: 38133602
- DOI: 10.1111/cogs.13393
Abstract
In our daily lives, we are continually involved in decision-making situations, many of which take place in the context of social interaction. Despite the ubiquity of such situations, there remains a gap in our understanding of how decision-making unfolds in social contexts and how communicative signals, such as social cues and feedback, impact the choices we make. Interestingly, there is a new social context to which humans are increasingly exposed: social interaction not only with other humans but also with artificial agents, such as robots or avatars. Given these technological developments, it is of great interest to ask whether, and in what way, social signals exhibited by non-human agents influence decision-making. The present study examined whether a robot's non-verbal communicative behavior affects human decision-making. To this end, we implemented a two-alternative-choice task, an adaptation of the "Shell Game," in which participants guessed which of two presented cups was covering a ball. A robot avatar acted as a game partner, producing social cues and feedback. We manipulated the robot's cues (pointing toward one of the cups) before the participant's decision and the robot's feedback ("thumb up" or no feedback) after the decision. Participants were slower, relative to the other conditions, when the cues were mostly invalid and the robot reacted positively to wins. We argue that this was due to the incongruence of the signals (cue vs. feedback), and thus a violation of expectations. In sum, our findings show that incongruence in pre- and post-decision social signals from a robot significantly influences task performance, highlighting the importance of understanding expectations toward social robots for effective human-robot interactions.
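To make the paradigm concrete, the sketch below illustrates how trials in such a cue-validity by feedback design could be generated. It is a minimal illustration only: the validity proportions, condition labels, and trial counts are assumptions for exposition, not the parameters used in the study.

```python
import random

# Illustrative sketch of the two-alternative-choice ("Shell Game") trial structure
# described in the abstract. All numbers and labels below are assumptions.

CUE_VALIDITY = {"mostly_valid": 0.75, "mostly_invalid": 0.25}   # P(cue points at the ball)
FEEDBACK_POLICY = {"positive_on_win": True, "no_feedback": False}

def make_trial(validity: float, feedback_on_win: bool) -> dict:
    """Build one trial: hide the ball, derive the robot's cue, note the feedback rule."""
    ball_side = random.choice(["left", "right"])
    cue_is_valid = random.random() < validity
    cue_side = ball_side if cue_is_valid else ("right" if ball_side == "left" else "left")
    return {"ball": ball_side, "cue": cue_side, "feedback_on_win": feedback_on_win}

def run_block(validity_label: str, feedback_label: str, n_trials: int = 20) -> list[dict]:
    """Generate one block of trials for a single cell of the design."""
    validity = CUE_VALIDITY[validity_label]
    feedback = FEEDBACK_POLICY[feedback_label]
    return [make_trial(validity, feedback) for _ in range(n_trials)]

if __name__ == "__main__":
    # The condition reported as slowest: mostly invalid cues with positive feedback on wins.
    block = run_block("mostly_invalid", "positive_on_win")
    print(block[:3])
```

In this sketch, the cue-feedback incongruence discussed in the abstract corresponds to pairing a low cue validity with positive feedback on correct choices; response times would be collected per trial in the actual experiment.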
Keywords: Human-robot interaction; Non-verbal communication; Response time; Social cues; Social decision-making.
© 2023 The Authors. Cognitive Science published by Wiley Periodicals LLC on behalf of Cognitive Science Society (CSS).