Distributed attention beats the down-side of statistical context learning in visual search
- PMID: 38755793
- PMCID: PMC7424102
- DOI: 10.1167/jov.20.7.4
Abstract
Spatial attention can be deployed with a narrow focus to process individual items or distributed relatively broadly to process larger parts of a scene. This study investigated how focused- versus distributed-attention modes contribute to the adaptation of context-based memories that guide visual search. In two experiments, participants were either required to fixate the screen center and use peripheral vision for search ("distributed attention"), or they could freely move their eyes, enabling serial scanning of the search array ("focused attention"). Both experiments consisted of an initial learning phase and a subsequent test phase. During learning, participants searched for targets presented either among repeated (invariant) or nonrepeated (randomly generated) spatial layouts of distractor items. Prior research showed that repeated encounters with invariant display arrangements lead to long-term context memory about these arrays, which can then come to guide search (the contextual-cueing effect). The crucial manipulation in the test phase was a change of the target location within an otherwise constant distractor layout, which has previously been shown to abolish the cueing effect. The current results replicated these findings, although importantly only when attention was focused. By contrast, with distributed attention, the cueing effect recovered rapidly and attained a level comparable to the initial effect (before the target location change). This indicates that contextual cueing can adapt more easily when attention is distributed, likely because a broad attentional set facilitates the flexible updating of global (distractor-distractor), as compared to more local (distractor-target), context representations, allowing local changes to be incorporated more readily.
Similar articles
- Taking Attention Out of Context: Frontopolar Transcranial Magnetic Stimulation Abolishes the Formation of New Context Memories in Visual Search. J Cogn Neurosci. 2019 Mar;31(3):442-452. doi: 10.1162/jocn_a_01358. Epub 2018 Nov 20. PMID: 30457915. Clinical Trial.
- Central and peripheral vision loss differentially affects contextual cueing in visual search. J Exp Psychol Learn Mem Cogn. 2015 Sep;41(5):1485-96. doi: 10.1037/xlm0000117. Epub 2015 Apr 13. PMID: 25867615. Clinical Trial.
- Stimulus-driven updating of long-term context memories in visual search. Psychol Res. 2022 Feb;86(1):252-267. doi: 10.1007/s00426-021-01474-w. Epub 2021 Jan 26. PMID: 33496847. Free PMC article.
- Contextual facilitation: Separable roles of contextual guidance and context suppression in visual search. Psychon Bull Rev. 2024 Dec;31(6):2672-2680. doi: 10.3758/s13423-024-02508-1. Epub 2024 Apr 30. PMID: 38689187. Free PMC article.
- Working memory dependence of spatial contextual cueing for visual search. Br J Psychol. 2019 May;110(2):372-380. doi: 10.1111/bjop.12311. Epub 2018 May 10. PMID: 29745430. Review.
Cited by
- Why Are Acquired Search-Guiding Context Memories Resistant to Updating? Front Psychol. 2021 Mar 1;12:650245. doi: 10.3389/fpsyg.2021.650245. eCollection 2021. PMID: 33732200. Free PMC article.
- Contextual Cueing Accelerated and Enhanced by Monetary Reward: Evidence From Event-Related Brain Potentials. Front Hum Neurosci. 2021 Apr 15;15:623931. doi: 10.3389/fnhum.2021.623931. eCollection 2021. PMID: 33935668. Free PMC article.
- Contextual Cueing Effect Under Rapid Presentation. Front Psychol. 2020 Dec 16;11:603520. doi: 10.3389/fpsyg.2020.603520. eCollection 2020. PMID: 33424716. Free PMC article.
- Task-based memory systems in contextual-cueing of visual search and explicit recognition. Sci Rep. 2020 Oct 5;10(1):16527. doi: 10.1038/s41598-020-71632-4. PMID: 33020507. Free PMC article.
- Template-based attentional guidance and generic procedural learning in contextual guided visual search: Evidence from reduced response time variability. J Vis. 2025 Apr 1;25(4):1. doi: 10.1167/jov.25.4.1. PMID: 40168157. Free PMC article.
References
- Annac E., Manginelli A. A., Pollmann S., Shi Z., Müller H. J., & Geyer T. (2013). Memory under pressure: Secondary-task effects on contextual cueing of visual search. Journal of Vision, 13(13), 6. PubMed.
- Annac E., Conci M., Müller H. J., & Geyer T. (2017). Local target context modulates adaptation of learned contextual cues. Visual Cognition, 25, 262–277.
- Beesley T., Vadillo M. A., Pearson D., & Shanks D. R. (2015). Pre-exposure of repeated search configurations facilitates subsequent contextual cuing of visual search. Journal of Experimental Psychology: Learning, Memory, and Cognition, 41(2), 348–362. PubMed.