J Neurosci. 2018 Apr 11;38(15):3657-3668.
doi: 10.1523/JNEUROSCI.2307-17.2018. Epub 2018 Mar 8.

Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color


Michael M Bannert et al. J Neurosci.

Abstract

Color is special among basic visual features in that it can form a defining part of objects that are engrained in our memory. Whereas most neuroimaging research on human color vision has focused on responses related to external stimulation, the present study investigated how sensory-driven color vision is linked to subjective color perception induced by object imagery. We recorded fMRI activity in male and female volunteers during viewing of abstract color stimuli that were red, green, or yellow in half of the runs. In the other half, we asked them to produce mental images of colored, meaningful objects (such as tomato, grapes, banana) corresponding to the same three color categories. Although physically presented color could be decoded from all retinotopically mapped visual areas, only hV4 allowed the colors of imagined objects to be predicted when classifiers were trained on responses to physical colors. Importantly, only the neural signal in hV4 was predictive of behavioral performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of sensory-driven and imagined object color, together with the behavioral link to neural representations in hV4, identifies area hV4 as a perceptual hub linking externally triggered color vision with color in self-generated object imagery.

Significance Statement

Humans experience color not only when visually exploring the outside world, but also in the absence of visual input, for example when remembering, dreaming, or imagining. It is not known where neural codes for sensory-driven and internally generated hue converge. In the current study we evoked matching subjective color percepts, one driven by physically presented color stimuli, the other by internally generated color imagery. This allowed us to identify area hV4 as the only site where the neural codes of corresponding subjective color percepts converged, regardless of their origin. Color codes in hV4 also predicted behavioral performance in an imagery task, suggesting it forms a perceptual hub for color perception.

Keywords: color vision; drift diffusion; fMRI; object imagery; pattern classification; reaction times.

Figures

Figure 1.
Stimulus material. Stimuli used in the imagery task. Each object belonged to one of three color categories (yellow, red, green). To reduce confounds unrelated to object color, objects were approximately matched in shape (elongated, round, pile-shaped) and semantic associations (all of them were fruits/vegetables). Before scanning and before each imagery fMRI run, participants practiced remembering the images of the nine natural objects (see Materials and Methods). During each imagery block, they imagined one of the objects and mentally compared its color to that of the subsequent object once the word cue appeared (Fig. 2).
Figure 2.
Experimental design. Trial sequence in real-color and imagery runs. In real-color runs (left), participants viewed concentric rings slowly drifting outward. The rings could be one of three colors: yellow, green, or red. Observers performed a detection task requiring them to press a button every time the stimulus luminance switched between high and low. There could be 0, 1, or 2 target events per stimulus presentation (8.5 s; ITI = 1.5 s). In imagery runs (right), participants saw a word cue at the beginning of the trial for 1.5 s, indicating which of the nine object images they had to imagine in the subsequent imagery phase (11.714 s). In addition to object imagery, participants performed a mental color comparison each time a new word cue appeared: they decided whether or not the color of the object to be imagined in the current trial matched the color of the object from the previous trial, and pressed one of two designated buttons accordingly (1-back same/different color judgment task).
Figure 3.
Color decoding results. Real-color-to-real-color and real-color-to-imagined-color decoding. A, Classifiers were trained to distinguish between the three color categories based on the responses in each ROI to the color stimuli. Classifier performance was cross-validated by leaving out one of the six runs for testing on each iteration to obtain an average accuracy. In all ROIs, classifiers could predict the color of the stimulus that participants were viewing significantly above chance. B, Color classifiers were trained on the whole set of fMRI responses to color stimuli to predict which color observers were seeing. The learned classifiers were then used to predict, on a trial-by-trial basis, the color of the objects that participants were imagining in the imagery runs of the experiment. Permutation tests showed that the color of the imagined objects could be decoded significantly above chance only from activity patterns in area hV4. A, B, Horizontal and vertical bars represent group means and two-tailed 95% confidence intervals, respectively. Chance level was 1/3. **p < 0.01, FWE corrected, one-tailed.
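The two decoding schemes in this legend can be sketched in a minimal, self-contained way. The snippet below uses synthetic data and a simple nearest-centroid classifier purely for illustration; the paper's actual classifier, voxel counts, and trial structure differ, and every name and number here is an assumption, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for ROI voxel patterns: 6 runs x 3 colors x 4 trials,
# 50 voxels. Each color has a fixed template pattern plus Gaussian noise.
n_runs, n_colors, n_trials, n_vox = 6, 3, 4, 50
color_templates = rng.normal(0, 1, (n_colors, n_vox))

def simulate_run():
    X, y = [], []
    for c in range(n_colors):
        X.append(color_templates[c] + rng.normal(0, 2, (n_trials, n_vox)))
        y += [c] * n_trials
    return np.vstack(X), np.array(y)

runs = [simulate_run() for _ in range(n_runs)]

def nearest_centroid_predict(X_train, y_train, X_test):
    # Train: one mean pattern (centroid) per color.
    # Test: assign each pattern to the closest centroid.
    centroids = np.stack([X_train[y_train == c].mean(0) for c in range(n_colors)])
    dists = ((X_test[:, None, :] - centroids[None]) ** 2).sum(-1)
    return dists.argmin(1)

# Leave-one-run-out cross-validation, as in panel A.
accs = []
for held_out in range(n_runs):
    X_tr = np.vstack([runs[r][0] for r in range(n_runs) if r != held_out])
    y_tr = np.concatenate([runs[r][1] for r in range(n_runs) if r != held_out])
    X_te, y_te = runs[held_out]
    accs.append((nearest_centroid_predict(X_tr, y_tr, X_te) == y_te).mean())
print(f"mean cross-validated accuracy: {np.mean(accs):.2f} (chance = 1/3)")

# Cross-decoding, as in panel B: train on all real-color runs, then
# predict labels for independently simulated "imagery" patterns.
X_all = np.vstack([r[0] for r in runs])
y_all = np.concatenate([r[1] for r in runs])
X_img, y_img = simulate_run()  # stand-in for imagery-run patterns
cross_acc = (nearest_centroid_predict(X_all, y_all, X_img) == y_img).mean()
print(f"cross-decoding accuracy: {cross_acc:.2f} (chance = 1/3)")
```

The key design point is that in panel B no imagery data ever enters training, so above-chance accuracy implies a shared neural code between seen and imagined color.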
Figure 4.
Shape decoding results. Classifiers were trained to distinguish between the three shapes (elongated, round, pile-shaped) and tested on data from one of the six imagery runs that was excluded from the training procedure in a sixfold leave-one-run-out cross-validation scheme. The shape property of objects was orthogonal to the color feature dimension (Fig. 1). Mean decoding accuracies were significantly above chance in areas V2, V3, and LO2 according to permutation tests. Horizontal and vertical bars represent group means and two-tailed 95% confidence intervals, respectively. Chance level was 1/3. *p < 0.05, **p < 0.01, FWE corrected, one-tailed.
Figure 5.
Brain-based drift diffusion modeling. HDDMs were used to study the relationship between the information content in ROIs and performance in the color judgment task. A, Behavioral data (RTs and errors) on trial j were modeled with different drift rates depending on whether the classifier predicted the correct or the incorrect label, either for the response pattern y_(j-1) on the previous trial j-1 (pre-judgment model), which was recorded before the behavioral judgment, or for the response pattern y_j on the same trial j, which was recorded after the behavioral judgment (post-judgment model). As depicted on the right, higher drift rates mean faster response times and fewer errors. B, Posterior probability distribution over the difference v_diff between the drift rates v_c on correctly classified trials and v_i on incorrectly classified trials for the two models in area hV4. The posterior probability of v_c being larger than v_i was 98.02% in the pre-judgment model, i.e., when different drift rates were assumed depending on the classification of the trial preceding the behavioral judgment. When classifier correctness for the imagery block following the behavioral judgment was used instead (post-judgment model), the posterior probability dropped to 92.75%. See main text for the values of all ROIs.
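The posterior probabilities reported in panel B are, in a Bayesian MCMC fit, simply the fraction of posterior samples in which v_c exceeds v_i. The sketch below illustrates that computation on synthetic posterior traces; the means and SDs are made-up placeholders, not the paper's estimates, and a real analysis would draw the traces from a hierarchical drift diffusion model fit (e.g. with an HDDM toolbox).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical posterior traces for the two drift rates, standing in for
# MCMC samples from a hierarchical drift diffusion model. The location
# and scale values below are illustrative only.
n_samples = 5000
v_correct = rng.normal(loc=2.0, scale=0.3, size=n_samples)    # v_c
v_incorrect = rng.normal(loc=1.4, scale=0.3, size=n_samples)  # v_i

# The quantity shown in panel B: the posterior over the drift-rate
# difference, summarized as the probability that v_c > v_i, estimated
# as the fraction of samples where the difference is positive.
v_diff = v_correct - v_incorrect
p_vc_greater = (v_diff > 0).mean()
print(f"P(v_c > v_i | data) = {p_vc_greater:.3f}")
```

A value near 1 (such as the 98.02% reported for the pre-judgment model) indicates strong evidence that evidence accumulation was faster on trials whose preceding brain activity the classifier decoded correctly.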
