Task-related gaze control in human crowd navigation
- PMID: 31993979
- PMCID: PMC7343766
- DOI: 10.3758/s13414-019-01952-9
Abstract
Human crowds provide an interesting case for research on the perception of people. In this study, we investigate how visual information is acquired for (1) navigating human crowds and (2) seeking out social affordances in crowds by studying gaze behavior during human crowd navigation under different task instructions. Observers (n = 11) wore head-mounted eye-tracking glasses and walked two rounds through hallways containing walking crowds (n = 38) and static objects. For round one, observers were instructed to avoid collisions. For round two, observers additionally had to indicate with a button press whether oncoming people made eye contact. Task performance (walking speed, absence of collisions) was similar across rounds. Fixation durations indicated that heads, bodies, objects, and walls maintained gaze for comparably long durations; only crowds in the distance maintained gaze relatively longer. We find no compelling evidence that human bodies and heads hold one's gaze more than objects while navigating crowds. When eye contact was assessed, heads were fixated more often and for a longer total duration, which came at the cost of looking at bodies. We conclude that gaze behavior in crowd navigation is task-dependent, and that not every fixation is strictly necessary for navigating crowds. When observers are explicitly tasked with seeking out potential social affordances, their gaze is modulated accordingly. We discuss our findings in the light of current theories and models of gaze behavior. Furthermore, we show that a head-mounted eye-tracking study can maintain a large degree of experimental control while leaving the observer many degrees of freedom.
Keywords: Gaze; Human crowds; Social affordances; Social interaction; Task; Wearable eye tracking.
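
To illustrate the kind of analysis summarized in the abstract (how often and for how long each area of interest is fixated under each task instruction), the sketch below aggregates manually coded fixations by AOI and condition. The data structure, condition names, AOI labels, and values are illustrative assumptions, not the authors' actual data or analysis pipeline.

```python
# Minimal sketch: aggregate manually coded fixations by area of interest (AOI)
# and task condition. All fields and values below are hypothetical examples.
from collections import defaultdict

# Each fixation: (condition, AOI label, duration in milliseconds)
fixations = [
    ("navigation", "head", 230),
    ("navigation", "body", 310),
    ("navigation", "object", 280),
    ("eye_contact", "head", 260),
    ("eye_contact", "head", 240),
    ("eye_contact", "body", 200),
]

counts = defaultdict(int)          # (condition, AOI) -> number of fixations
total_duration = defaultdict(int)  # (condition, AOI) -> summed duration (ms)

for condition, aoi, duration_ms in fixations:
    counts[(condition, aoi)] += 1
    total_duration[(condition, aoi)] += duration_ms

# Report fixation count, total dwell time, and mean fixation duration per AOI
for key in sorted(counts):
    condition, aoi = key
    mean_ms = total_duration[key] / counts[key]
    print(f"{condition:12s} {aoi:7s} "
          f"n={counts[key]:2d} total={total_duration[key]:4d} ms "
          f"mean={mean_ms:.0f} ms")
```

Run as-is, this prints per-condition fixation counts, total dwell time, and mean fixation duration for each AOI; comparing the "head" and "body" rows across the two conditions mirrors the contrast reported in the abstract.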