I Reach Faster When I See You Look: Gaze Effects in Human-Human and Human-Robot Face-to-Face Cooperation
- PMID: 22563315
- PMCID: PMC3342577
- DOI: 10.3389/fnbot.2012.00003
Abstract
Human-human interaction in natural environments relies on a variety of perceptual cues. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should now be able to manipulate and exploit these social cues in cooperation with their human partners. Previous studies have demonstrated that people follow human and robot gaze, and that it can help them to cope with spatially ambiguous language. Our goal is to extend these findings into the domain of action, to determine how human and robot gaze can influence the speed and accuracy of human action. We report on results from a human-human cooperation experiment demonstrating that an agent's vision of her/his partner's gaze can significantly improve that agent's performance in a cooperative task. We then implement a heuristic capability to generate such gaze cues by a humanoid robot that engages in the same cooperative interaction. The subsequent human-robot experiments demonstrate that a human agent can indeed exploit the predictive gaze of their robot partner in a cooperative task. This allows us to render the humanoid robot more human-like in its ability to communicate with humans. The long-term objectives of the work are thus to identify social cooperation cues, and to validate their pertinence through implementation in a cooperative robot. The current research provides the robot with the capability to produce appropriate speech and gaze cues in the context of human-robot cooperation tasks. Gaze is manipulated in three conditions: full gaze (coordinated eye and head), eyes hidden with sunglasses, and head fixed. We demonstrate the pertinence of these cues in terms of statistical measures of action times for humans in the context of a cooperative task, as gaze significantly facilitates cooperation as measured by human response times.
Keywords: cooperation; gaze; human–human interaction; human–robot interaction.
Similar articles
- See You See Me: The Role of Eye Contact in Multimodal Human-Robot Interaction. ACM Trans Interact Intell Syst. 2016;6(1):2. doi: 10.1145/2882970. PMID: 28966875.
- Toward an Attentive Robotic Architecture: Learning-Based Mutual Gaze Estimation in Human-Robot Interaction. Front Robot AI. 2022;9:770165. doi: 10.3389/frobt.2022.770165. PMID: 35321344.
- Robot Gaze Behavior Affects Honesty in Human-Robot Interaction. Front Artif Intell. 2021;4:663190. doi: 10.3389/frai.2021.663190. PMID: 34046585.
- Human-machine-human interaction in motor control and rehabilitation: a review. J Neuroeng Rehabil. 2021;18(1):183. doi: 10.1186/s12984-021-00974-5. PMID: 34961530.
- Would life be better as a robot? Sci Robot. 2023;8(85):eadn0643. doi: 10.1126/scirobotics.adn0643. PMID: 38117870.
Cited by
- Perceiving what you intend to do from what you do: evidence for embodiment in social interactions. Socioaffect Neurosci Psychol. 2015;5:28602. doi: 10.3402/snp.v5.28602. PMID: 26246478.
- Learning Semantics of Gestural Instructions for Human-Robot Collaboration. Front Neurorobot. 2018;12:7. doi: 10.3389/fnbot.2018.00007. PMID: 29615888.
- Directing Attention Through Gaze Hints Improves Task Solving in Human-Humanoid Interaction. Int J Soc Robot. 2018;10(3):343-355. doi: 10.1007/s12369-018-0473-8. PMID: 30996753.
- Humans Can't Resist Robot Eyes - Reflexive Cueing With Pseudo-Social Stimuli. Front Robot AI. 2022;9:848295. doi: 10.3389/frobt.2022.848295. PMID: 37274454.
- Contribution of Developmental Psychology to the Study of Social Interactions: Some Factors in Play, Joint Attention and Joint Action and Implications for Robotics. Front Psychol. 2018;9:1992. doi: 10.3389/fpsyg.2018.01992. PMID: 30405484.