Look together: analyzing gaze coordination with epistemic network analysis
- PMID: 26257677
- PMCID: PMC4508484
- DOI: 10.3389/fpsyg.2015.01016
Abstract
When conversing and collaborating in everyday situations, people naturally and interactively align their behaviors with each other across various communication channels, including speech, gesture, posture, and gaze. Having access to a partner's referential gaze behavior has been shown to be particularly important in achieving collaborative outcomes, but the process by which people's gaze behaviors unfold over the course of an interaction and become tightly coordinated is not well understood. In this paper, we present work to develop a deeper and more nuanced understanding of coordinated referential gaze in collaborating dyads. We recruited 13 dyads to participate in a collaborative sandwich-making task and used dual mobile eye tracking to synchronously record each participant's gaze behavior. We used a relatively new analysis technique, epistemic network analysis, to jointly model the gaze behaviors of both conversational participants. In this analysis, network nodes represent gaze targets for each participant, and edge strengths convey the likelihood of simultaneous gaze to the connected target nodes during a given time-slice. We divided collaborative task sequences into discrete phases to examine how the networks of shared gaze evolved over longer time windows. We conducted three separate analyses of the data to reveal (1) properties and patterns of how gaze coordination unfolds throughout an interaction sequence, (2) optimal time lags of gaze alignment within a dyad at different phases of the interaction, and (3) differences in gaze coordination patterns for interaction sequences that lead to breakdowns and repairs. In addition to contributing to the growing body of knowledge on the coordination of gaze behaviors in joint activities, this work has implications for the design of future technologies that engage in situated interactions with human users.
Keywords: conversational repair; epistemic network analysis; gaze tracking; referential gaze; social signals.
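The network model described in the abstract can be illustrated with a minimal sketch: given two time-aligned sequences of gaze targets (one per participant), each edge weight is the fraction of time-slices in which the two participants simultaneously fixate the connected pair of targets, and gaze alignment can be scored at different time lags. The function names and the toy target labels below are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter

def cogaze_network(gaze_a, gaze_b):
    """Edge weight for (x, y) = fraction of time-slices in which
    participant A fixates target x while participant B fixates y."""
    assert len(gaze_a) == len(gaze_b)
    counts = Counter(zip(gaze_a, gaze_b))
    n = len(gaze_a)
    return {edge: c / n for edge, c in counts.items()}

def alignment_at_lag(gaze_a, gaze_b, lag):
    """Fraction of slices where A's target at time t matches
    B's target at time t + lag (overlapping region only)."""
    pairs = [(gaze_a[t], gaze_b[t + lag])
             for t in range(len(gaze_a))
             if 0 <= t + lag < len(gaze_b)]
    return sum(x == y for x, y in pairs) / len(pairs)

def best_lag(gaze_a, gaze_b, max_lag=3):
    """Lag (in time-slices) that maximizes gaze alignment."""
    return max(range(-max_lag, max_lag + 1),
               key=lambda lag: alignment_at_lag(gaze_a, gaze_b, lag))

# Toy example: targets loosely inspired by the sandwich-making task.
a = ["bread", "bread", "ham", "ham", "plate", "ham"]
b = ["bread", "ham",   "ham", "ham", "plate", "bread"]
net = cogaze_network(a, b)   # e.g. net[("ham", "ham")] == 2/6
lag = best_lag(a, b)
```

Edge weights over all target pairs sum to 1, so the dictionary is a joint distribution over simultaneous gaze; a negative `best_lag` would indicate that one partner's gaze tends to lead the other's.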