2016 Oct 25;3:160092.
doi: 10.1038/sdata.2016.92.

A studyforrest extension, simultaneous fMRI and eye gaze recordings during prolonged natural stimulation


Michael Hanke et al. Sci Data.

Abstract

Here we present an update of the studyforrest (http://studyforrest.org) dataset that complements the previously released functional magnetic resonance imaging (fMRI) data for natural language processing with a new two-hour 3 Tesla fMRI acquisition during which 15 of the original participants were shown an audio-visual version of the stimulus motion picture. With two validation analyses we demonstrate that these new data support modeling specific properties of the complex natural stimulus, and that they show substantial within-subject BOLD response congruency with the existing fMRI data for audio-only stimulation in brain areas related to the processing of auditory input, speech, and narrative. In addition, we provide participants' eye gaze location as recorded simultaneously with fMRI, as well as recordings from an additional sample of 15 control participants whose eye gaze trajectories for the entire movie were captured in a lab setting, to enable studies on attentional processes and comparative investigations of the potential impact of the stimulation setting on these processes.


Conflict of interest statement

The authors declare no competing financial interests.

Figures

Figure 1. Summary statistics for head movement estimates across movie segments and participants.
These estimates indicate relative motion with respect to a template brain volume computed for each participant across all scans. The area shaded in light gray depicts the range across participants, the medium-gray area indicates the central 50% of values around the mean, and the dark-gray area shows ±1 s.e. of the mean. The black line indicates the median estimate. Dashed vertical lines indicate run boundaries where participants had a brief break. The red lines indicate the motion estimate time series of outlier participants. An outlier was defined as a participant whose motion estimate exceeded a distance of three s.d. from the mean across participants for at least one fMRI volume in a run and exceeded a maximum translation of 1.5 mm or a rotation of 1.5°. For a breakdown of detected outliers, see Table 1.
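The outlier criterion described in the caption can be expressed compactly in code. The following is a minimal sketch, not the authors' actual pipeline: the function name, array layout (6 motion parameters per volume), and grouping are illustrative assumptions.

```python
import numpy as np

def find_motion_outliers(motion, sd_thresh=3.0, trans_thresh=1.5, rot_thresh=1.5):
    """Flag outlier participants in one run from motion-parameter estimates.

    motion: array of shape (n_participants, n_volumes, 6) holding three
    translation estimates (mm) and three rotation estimates (degrees)
    per fMRI volume. Layout and defaults are illustrative assumptions;
    thresholds follow the criterion stated in the figure caption.
    """
    trans, rot = motion[..., :3], motion[..., 3:]
    # Euclidean displacement per volume, and group mean/s.d. across participants
    disp = np.linalg.norm(trans, axis=-1)
    mean, sd = disp.mean(axis=0), disp.std(axis=0)
    outliers = []
    for p in range(motion.shape[0]):
        # criterion 1: > 3 s.d. from the group mean for at least one volume
        exceeds_sd = np.any(disp[p] > mean + sd_thresh * sd)
        # criterion 2: absolute translation > 1.5 mm or rotation > 1.5 degrees
        exceeds_abs = (np.abs(trans[p]).max() > trans_thresh
                       or np.abs(rot[p]).max() > rot_thresh)
        if exceeds_sd and exceeds_abs:
            outliers.append(p)
    return outliers
```

Both conditions must hold, so a participant who moves with the group, or whose deviation stays under the absolute limits, is not flagged.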
Figure 2. Mass-univariate regression group analysis of specific properties of portrayed emotion in the movie stimulus.
Contrast 1: other-directed > self-directed emotion (red-yellow) leads to engagement of a large network including large parts of the dorsal medial cingulate gyrus, superior frontal gyrus, and precuneus, the latter being implicated in a variety of social cognition functions. Contrasts 2 and 3: BOLD response correlations with portrayed emotions for auditory > non-auditory cues (purple) reveal primary auditory cortex and large portions of the superior temporal gyri. In contrast, visual > non-visual cues (cyan) lead to pronounced activity patterns in the occipital lobe, comprising primary visual cortex and visual association areas. All coordinates are in MNI space.
Figure 3. Average voxelwise within-subject correlation of BOLD time series between the previous audio-only movie experiment and the present audio-visual movie experiment.
Correlations were Fisher-transformed to Z-scores and averaged across movie segments (n=8) and participants (n=14). The legend shows the equivalent group-average correlation range for comparison. As expected, significant correlations are observed in brain areas associated with speech and story processing, but also in occipito-temporal cortex, despite the lack of relevant visual stimulation in the audio-only experiment. Results are shown on the reconstructed surface of the MNI152 brain template.
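Averaging correlations via the Fisher z-transform, as the caption describes, is a standard procedure: correlations are mapped to z-scores with the inverse hyperbolic tangent, averaged in z-space, and (for display as an r value) mapped back. A minimal sketch, not the authors' exact code:

```python
import numpy as np

def mean_correlation_fisher(correlations):
    """Average correlation coefficients via the Fisher z-transform.

    correlations: array of per-segment, per-participant correlation
    values in (-1, 1), stacked along the first axis. Averaging is done
    in z-space because r values are not additive; the result is
    transformed back to an r value. (Sketch of the standard procedure.)
    """
    z = np.arctanh(correlations)      # Fisher r-to-z transform
    return np.tanh(z.mean(axis=0))    # average in z-space, map back to r
```

Averaging in z-space avoids the downward bias that a direct arithmetic mean of r values would introduce for strong correlations.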
Figure 4. Technical validation of eye gaze recordings from the in-lab (purple curves) and the in-scanner (blue curves) samples.
(a) Timing precision of movie frame presentation. The histogram shows the distribution of frame display durations across all participants and movie segments. (b) Fraction of signal loss per participant. The box plots show the distribution of signal loss across all movie segments, sorted by median loss. Sample association for individual participants is color-coded on the X-axis. (c) A one-minute time series excerpt of the gaze diversity scores computed across all participants. The gray dotted vertical lines mark the positions of movie cuts. The horizontal blue line represents the 95th percentile for the full movie, the green line the 50th percentile, and the yellow line the 5th percentile. The three exemplary movie frames at the top show representatives of each chosen percentile, overlaid with the empirical gaze locations (red); the brightness modulation on top of the movie frame content depicts the result of the Gaussian smoothing of the computed gaze distribution heat map. (d) The mean and s.e. of all diversity score segments in the temporal vicinity of a cut, computed separately for the in-scanner and in-lab participants (Y-axis conceptually similar, but not identical, to panel c). Participants 5, 10, 20, and 36 were excluded from the analyses presented in panels c and d due to low spatial accuracy of the eye gaze recordings (see Table 1).
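The caption mentions a gaze diversity score derived from a Gaussian-smoothed heat map of gaze locations, but does not define it. One common way to quantify gaze dispersion across viewers is the entropy of the smoothed gaze density; the sketch below illustrates that idea. Everything here (function name, bin counts, kernel width, entropy as the score) is an assumption for illustration, not the paper's actual definition.

```python
import numpy as np

def gaze_diversity(gaze_xy, frame_shape=(720, 1280), bins=(18, 32), sigma=1.0):
    """Toy gaze-diversity score for one movie frame (illustrative only).

    gaze_xy: (n_participants, 2) array of (x, y) gaze positions in pixels.
    Bins the gaze locations into a 2-D histogram, smooths it with a
    separable Gaussian kernel, and returns the Shannon entropy (bits)
    of the resulting density: low when all viewers look at the same
    spot, high when gaze is spread across the frame.
    """
    y_edges = np.linspace(0, frame_shape[0], bins[0] + 1)
    x_edges = np.linspace(0, frame_shape[1], bins[1] + 1)
    heat, _, _ = np.histogram2d(gaze_xy[:, 1], gaze_xy[:, 0],
                                bins=(y_edges, x_edges))
    # separable Gaussian smoothing with a small discrete kernel
    r = int(3 * sigma)
    k = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma) ** 2)
    k /= k.sum()
    heat = np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 0, heat)
    heat = np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 1, heat)
    p = heat / heat.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())   # Shannon entropy in bits
```

Under this definition, frames right after a cut, where gaze typically converges on a new salient region, would score lower than free-viewing frames with dispersed gaze.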
