Long-term adaptation to change in implicit contextual learning
- PMID: 24395095
- DOI: 10.3758/s13423-013-0568-z
Abstract
The visual world consists of spatial regularities that are acquired through experience in order to guide attentional orienting. For instance, in visual search, detection of a target is faster when a layout of nontarget items is encountered repeatedly, suggesting that learned contextual associations can guide attention (contextual cuing). However, scene layouts sometimes change, requiring observers to adapt previous memory representations. Here, we investigated the long-term dynamics of contextual adaptation after a permanent change of the target location. We observed fast and reliable learning of initial context-target associations after just three repetitions. However, adaptation of acquired contextual representations to relocated targets was slow and effortful, requiring 3 days of training with overall 80 repetitions. A final test 1 week later revealed equivalent effects of contextual cuing for both target locations, and these were comparable to the effects observed on day 1. That is, observers learned both initial target locations and relocated targets, given extensive training combined with extended periods of consolidation. Thus, while implicit contextual learning efficiently extracts statistical regularities of our environment at first, it is rather insensitive to change in the longer term, especially when subtle changes in context-target associations need to be acquired.
Similar articles
- Automatic Guidance (and Misguidance) of Visuospatial Attention by Acquired Scene Memory: Evidence From an N1pc Polarity Reversal. Psychol Sci. 2020 Dec;31(12):1531-1543. doi: 10.1177/0956797620954815. PMID: 33119432. Free PMC article.
- Object-based implicit learning in visual search: perceptual segmentation constrains contextual cueing. J Vis. 2013 Jul 9;13(3):15. doi: 10.1167/13.3.15. PMID: 23838562.
- Contextual remapping in visual search after predictable target-location changes. Psychol Res. 2011 Jul;75(4):279-89. doi: 10.1007/s00426-010-0306-3. PMID: 20725739.
- Working memory dependence of spatial contextual cueing for visual search. Br J Psychol. 2019 May;110(2):372-380. doi: 10.1111/bjop.12311. PMID: 29745430. Review.
- What to expect where and when: how statistical learning drives visual selection. Trends Cogn Sci. 2022 Oct;26(10):860-872. doi: 10.1016/j.tics.2022.06.001. PMID: 35840476. Review.
Cited by
- Automatic Guidance (and Misguidance) of Visuospatial Attention by Acquired Scene Memory: Evidence From an N1pc Polarity Reversal. Psychol Sci. 2020 Dec;31(12):1531-1543. doi: 10.1177/0956797620954815. PMID: 33119432. Free PMC article.
- Global Repetition Influences Contextual Cueing. Front Psychol. 2018 Mar 27;9:402. doi: 10.3389/fpsyg.2018.00402. PMID: 29636716. Free PMC article.
- Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys. 2020 May;82(4):1682-1694. doi: 10.3758/s13414-019-01907-0. PMID: 31845105. Free PMC article.
- Position-invariant icon remapping facilitates search performance in foldable smartphones through the contribution of contextual cueing. Cogn Res Princ Implic. 2025 Sep 1;10(1):57. doi: 10.1186/s41235-025-00668-9. PMID: 40890492. Free PMC article.
- Why Are Acquired Search-Guiding Context Memories Resistant to Updating? Front Psychol. 2021 Mar 1;12:650245. doi: 10.3389/fpsyg.2021.650245. PMID: 33732200. Free PMC article.