Contextual cueing: implicit learning and memory of visual context guides spatial attention
- PMID: 9679076
- DOI: 10.1006/cogp.1998.0681
Abstract
Global context plays an important, but poorly understood, role in visual tasks. This study demonstrates that a robust memory for visual context exists to guide spatial attention. Global context was operationalized as the spatial layout of objects in visual search displays. Half of the configurations were repeated across blocks throughout the entire session, and targets appeared within consistent locations in these arrays. Targets appearing in learned configurations were detected more quickly. This newly discovered form of search facilitation is termed contextual cueing. Contextual cueing is driven by incidentally learned associations between spatial configurations (context) and target locations. This benefit was obtained despite chance performance for recognizing the configurations, suggesting that the memory for context was implicit. The results show how implicit learning and memory of visual context can guide spatial attention towards task-relevant aspects of a scene.
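The trial structure described in the abstract (repeated "old" configurations with consistent target locations interleaved with freshly generated "new" configurations in every block) can be illustrated with a short sketch. The Python below is not the authors' stimulus code; the grid size, set size, number of repeated configurations, and number of blocks are illustrative assumptions, not values reported in the abstract.

```python
import random

GRID = [(r, c) for r in range(8) for c in range(6)]  # hypothetical 8x6 grid of item locations
SET_SIZE = 12                                        # hypothetical number of items per display


def make_configuration(rng):
    """Sample one spatial layout: a target location plus distractor locations."""
    cells = rng.sample(GRID, SET_SIZE)
    return {"target": cells[0], "distractors": cells[1:]}


def build_session(n_repeated=12, n_blocks=30, seed=0):
    """Build a session in which half of the displays in each block are repeated
    ('old') configurations reused with the same target location throughout,
    and the other half are newly generated ('new') configurations."""
    rng = random.Random(seed)
    old_set = [make_configuration(rng) for _ in range(n_repeated)]
    session = []
    for _ in range(n_blocks):
        trials = [("old", cfg) for cfg in old_set]                                # same layouts every block
        trials += [("new", make_configuration(rng)) for _ in range(n_repeated)]   # fresh layouts each block
        rng.shuffle(trials)
        session.append(trials)
    return session


if __name__ == "__main__":
    session = build_session()
    print(len(session), "blocks of", len(session[0]), "trials each")
```

Contextual cueing is then measured as the reaction-time advantage for targets in "old" versus "new" displays that emerges over blocks, even though recognition of the repeated layouts remains at chance.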
Similar articles
- The role of the basal ganglia in implicit contextual learning: a study of Parkinson's disease. Neuropsychologia. 2009 Apr;47(5):1269-73. doi: 10.1016/j.neuropsychologia.2009.01.008. PMID: 19428390
- Cortical dynamics of contextually cued attentive visual learning and search: spatial and object evidence accumulation. Psychol Rev. 2010 Oct;117(4):1080-112. doi: 10.1037/a0020664. PMID: 21038974
- Stimulus homogeneity enhances implicit learning: evidence from contextual cueing. Vision Res. 2014 Apr;97:108-16. doi: 10.1016/j.visres.2014.02.008. PMID: 24603347
- Working memory dependence of spatial contextual cueing for visual search. Br J Psychol. 2019 May;110(2):372-380. doi: 10.1111/bjop.12311. PMID: 29745430. Review.
- View-invariant object category learning, recognition, and search: how spatial and object attention are coordinated using surface-based attentional shrouds. Cogn Psychol. 2009 Feb;58(1):1-48. doi: 10.1016/j.cogpsych.2008.05.001. PMID: 18653176. Review.
Cited by
- Visual search of experts in medical image reading: the effect of training, target prevalence, and expert knowledge. Front Psychol. 2013 Apr 5;4:166. doi: 10.3389/fpsyg.2013.00166. PMID: 23576997. Free PMC article.
- Contextual cuing as a form of nonconscious learning: Theoretical and empirical analysis in large and very large samples. Psychon Bull Rev. 2016 Dec;23(6):1996-2009. doi: 10.3758/s13423-016-1063-0. PMID: 27220995
- Automatic and intentional memory processes in visual search. Psychon Bull Rev. 2004 Oct;11(5):854-61. doi: 10.3758/bf03196712. PMID: 15732694
- Repeated Contextual Search Cues Lead to Reduced BOLD-Onset Times in Early Visual and Left Inferior Frontal Cortex. Open Neuroimag J. 2010 Apr 1;4:9-15. doi: 10.2174/1874440001004010009. PMID: 20563254. Free PMC article.
- Contextually-Based Social Attention Diverges across Covert and Overt Measures. Vision (Basel). 2019 Jun 10;3(2):29. doi: 10.3390/vision3020029. PMID: 31735830. Free PMC article.