Neural Encoding of Active Multi-Sensing Enhances Perceptual Decision-Making via a Synergistic Cross-Modal Interaction

Ioannis Delis et al. J Neurosci. 2022 Mar 16;42(11):2344-2355. doi: 10.1523/JNEUROSCI.0861-21.2022. Epub 2022 Jan 28.

Abstract

Most perceptual decisions rely on the active acquisition of evidence from the environment involving stimulation from multiple senses. However, our understanding of the neural mechanisms underlying this process is limited. Crucially, it remains elusive how different sensory representations interact in the formation of perceptual decisions. To answer these questions, we used an active sensing paradigm coupled with neuroimaging, multivariate analysis, and computational modeling to probe how the human brain processes multisensory information to make perceptual judgments. Participants of both sexes actively sensed to discriminate two texture stimuli using visual (V) or haptic (H) information or the two sensory cues together (VH). Crucially, information acquisition was under the participants' control, who could choose where to sample information from and for how long on each trial. To understand the neural underpinnings of this process, we first characterized where and when active sensory experience (movement patterns) is encoded in human brain activity (EEG) in the three sensory conditions. Then, to offer a neurocomputational account of active multisensory decision formation, we used these neural representations of active sensing to inform a drift diffusion model of decision-making behavior. This revealed a multisensory enhancement of the neural representation of active sensing, which led to faster and more accurate multisensory decisions. We then dissected the interactions between the V, H, and VH representations using a novel information-theoretic methodology. 
Ultimately, we identified a synergistic neural interaction between the two unisensory (V, H) representations over contralateral somatosensory and motor locations that predicted multisensory (VH) decision-making performance.

Significance Statement

In real-world settings, perceptual decisions are made during active behaviors, such as crossing the road on a rainy night, and draw on information from different senses (e.g., car lights, slippery ground). Critically, it remains largely unknown how sensory evidence is combined and translated into perceptual decisions in such active scenarios. Here we address this knowledge gap. First, we show that the simultaneous exploration of information across senses (multi-sensing) enhances the neural encoding of active sensing movements. Second, the neural representation of active sensing modulates the evidence available for the decision; importantly, multi-sensing yields faster evidence accumulation. Finally, we identify a cross-modal interaction in the human brain that correlates with multisensory performance, constituting a putative neural mechanism for forging active multisensory perception.
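The abstract's central modeling idea, that a stronger neural representation of active sensing speeds evidence accumulation, can be illustrated with a minimal drift-diffusion simulation. This is an illustrative sketch, not the authors' hierarchical model: the drift values, boundary, and nondecision time are arbitrary stand-ins for the unisensory and multisensory conditions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ddm(drift, boundary=1.0, ndt=0.3, dt=0.002, n_trials=500):
    """Simulate a basic drift-diffusion model: noisy evidence accumulates
    at mean rate `drift` until it hits +boundary (correct) or -boundary
    (error). Returns (accuracy, mean RT in seconds)."""
    correct, rts = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:
            x += drift * dt + rng.normal(0.0, np.sqrt(dt))
            t += dt
        correct.append(x >= boundary)
        rts.append(t + ndt)
    return float(np.mean(correct)), float(np.mean(rts))

# Doubling the drift rate (a hypothetical stand-in for the multisensory
# enhancement) yields faster and more accurate simulated decisions.
acc_uni, rt_uni = simulate_ddm(drift=0.8)
acc_multi, rt_multi = simulate_ddm(drift=1.6)
```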

Keywords: EEG; active sensing; drift diffusion model; multisensory processing; partial information decomposition; perceptual decision-making.


Figures

Figure 1.
Experimental design and behavioral results. A, The Pantograph is a haptic device used to render virtual surfaces that can be actively sensed. Top, The parts of the Pantograph shown from a lateral view. Participants placed their index finger on the interface plate. Bottom, The Pantograph device used in this experiment. B, The stimulus in the three sensory conditions. We programmed the Pantograph to generate a virtual grating texture. The workspace was split into two subspaces (L and R) that differed in the amplitude of the virtual surface that the participants actively sensed. One of the two sides (randomly assigned) had the reference amplitude (equal to 1), and the other had the comparison amplitude, which varied on each trial, taking one of the values 0.5, 0.75, 0.9, 1.1, 1.25, or 1.5. Participants performed the task using V, H, or VH. The amplitude of the stimulus in the haptic domain (H) was translated into contrast in the visual domain (V). Crucially, to match the H condition, only a moving dot following the participant's finger was revealed on the screen in V. C, Index finger trajectory indicating the scanning pattern of the virtual texture in one trial. On this trial, the participant actively sensed the left subspace first, then moved to the right subspace and explored it before returning to the left subspace and reporting their choice. D, Psychometric curves indicating the percentage of nonreference choices for all three sensory conditions (blue represents V; green represents H; red represents VH) and for all stimulus differences. Large dots represent the average percentage of choices across participants; smaller dots represent individual participant means. Data are fit using cumulative Gaussian functions. E, Cumulative distribution functions (CDFs) of RTs for all three sensory conditions (blue represents V; green represents H; red represents VH) across all trials of all participants. Thick lines indicate CDFs across all participant data; thin lines indicate individual participant CDFs for each sensory condition.
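The cumulative-Gaussian fits in panel D can be sketched as follows. The comparison amplitudes are the six values listed in the legend, but the choice proportions below are hypothetical, invented only to show the fitting step.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Comparison amplitudes from the legend (reference amplitude = 1.0) and
# hypothetical proportions of "nonreference" choices for one condition.
amps = np.array([0.5, 0.75, 0.9, 1.1, 1.25, 1.5])
p_choice = np.array([0.05, 0.20, 0.42, 0.60, 0.82, 0.96])

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, amps, p_choice, p0=[1.0, 0.3])
# mu estimates the point of subjective equality (near the reference, 1.0);
# sigma indexes the discrimination threshold (shallower curve = larger sigma).
```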
Figure 2.
Results of velocity reconstruction analysis using EEG signals. A, Scalp topographies of the forward models representing neural encoding of instantaneous finger velocity for the three sensory conditions. The presented scalp maps show velocity-encoding EEG signals averaged over the following time windows: 20 and 120 ms lags between velocity and EEG for V and VH, and 60 and 160 ms lags for H. B, Accuracy of the velocity reconstruction from the EEG signals measured using the squared correlation coefficient (r2) between the original and the approximated velocity profile in the three sensory conditions (blue represents V; green represents H; red represents VH). Bars represent means across participants. Error bars indicate SEM. Dots represent individual participant data. C, D, Temporal response functions (TRFs) of the velocity-encoding EEG activity in the three sensory conditions (blue represents V; green represents H; red represents VH) averaged over frontal electrodes (in C) and over occipital electrodes (in D).
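The velocity-reconstruction analysis is, at its core, a lagged linear mapping between EEG and finger velocity, scored by the squared correlation r². A minimal sketch on synthetic data, assuming a simple ridge-regression decoder; the channel count, lag range, and noise levels are arbitrary choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: 10 s of finger velocity at 100 Hz and 8 "EEG"
# channels that encode it at a 50 ms lag, plus noise (all hypothetical).
fs, lag = 100, 5
vel = rng.normal(size=1000)
eeg = np.stack([np.roll(vel, lag) * rng.uniform(0.5, 1.5)
                + rng.normal(size=1000) for _ in range(8)])

# Build lagged EEG features (0-150 ms) and solve a ridge regression for
# the decoding weights that reconstruct velocity from the EEG.
lags = np.arange(0, 16)
X = np.stack([np.roll(eeg, -l, axis=1) for l in lags]).reshape(8 * 16, 1000).T
lam = 1e2
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ vel)
vel_hat = X @ W
r2 = np.corrcoef(vel, vel_hat)[0, 1] ** 2   # reconstruction accuracy
```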
Figure 3.
Informed modeling of decision-making behavior. A, Comparison of the best-fitting model (with r2 as a regressor of drift rate δ only and ncr, tlow as regressors of nondecision time τ only) with alternate models using the DIC. Positive ΔDIC (DICmodel – DICoptimal) values for all six models indicate that the model of choice achieved a better trade-off between goodness of fit and number of free parameters. B, Graphical representation showing hierarchical estimation of HDDM parameters. Round nodes represent continuous random variables. Double-bordered nodes represent variables defined in terms of other variables. Shaded nodes represent recorded or computed signals, that is, single-trial behavioral data (accuracy, RT, and stimulus differences, s), EEG-velocity couplings (r2), and kinematic parameters (ncr, tlow). Parameters are modeled as Gaussian random variables with inferred means μ and variances σ2. Plates denote that multiple random variables share the same parents and children. The outer plate is over sensory conditions (V, H, VH), and the inner plate is over all trials (K) and participants (N). C, Behavioral RT distributions are shown as histograms for each sensory condition (blue represents V; green represents H; red represents VH) for correct (right) and incorrect (left) trials together with the HDDM fits (black lines). Higher histogram values on the right indicate higher proportion of correct choices. D, Posterior distributions of regression coefficients (γ1) of the EEG-velocity couplings (r2), as predictors of the drift rate (δ) of the HDDM shown in A. The three colored curves indicate posterior distributions for the three sensory conditions (blue represents V; green represents H; red represents VH). E, Posterior distributions of decision boundaries for the three sensory conditions (blue represents V; green represents H; red represents VH). 
F, Cross-participant correlation of differences in choice accuracy (ΔAcc, x axis) and differences in β1 (Δβ1, y axis) between the multisensory (VH) and the two unisensory (V, H) conditions (yellow represents VH-V; purple represents VH-H). G, Posterior distributions of regression coefficients (βsw) of the number of crossings between L and R (ncr) as a predictor of the nondecision time (τ) of the HDDM shown in A. H, Posterior distributions of regression coefficients (βexp) of the time spent on the low-amplitude stimulus (tlow) as a predictor of the nondecision time (τ) of the HDDM shown in A. I, Cross-participant correlation of average RTs across trials and sensory conditions (x axis) and βexp (y axis).
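The informed-model structure in this figure, where trial-wise EEG-velocity couplings (r²) regress onto the drift rate, can be sketched with the closed-form mean RT of a symmetric diffusion model. The coupling values and regression coefficients below are hypothetical; the sketch only shows why a positive drift-rate coefficient implies faster responses on trials with stronger neural encoding.

```python
import numpy as np

rng = np.random.default_rng(2)

def ddm_mean_rt(drift, a=1.0, ndt=0.35):
    """Closed-form mean RT of a symmetric, unit-noise diffusion model:
    E[RT] = (a / v) * tanh(a * v) + nondecision time."""
    return (a / drift) * np.tanh(a * drift) + ndt

# Hypothetical trial-wise EEG-velocity couplings (r^2) and a linear
# drift-rate regression of the kind used in informed HDDMs:
# delta_k = gamma0 + gamma1 * r2_k.
gamma0, gamma1 = 0.4, 2.0
r2 = rng.uniform(0.05, 0.5, size=200)
drift = gamma0 + gamma1 * r2
mean_rt = ddm_mean_rt(drift)

# Stronger neural encoding of active sensing -> higher drift -> faster RTs.
corr = np.corrcoef(r2, mean_rt)[0, 1]
```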
Figure 4.
Neural representations and cross-modal interactions. A, Results of PID applied to predict the multisensory (VH) model of active sensing from the two unisensory (V and H) models. Dots on the scalp topographies indicate the EEG channels that provide significant (p < 0.01, FDR-corrected) visual unique (top left), haptic unique (top right), redundant (bottom left), and synergistic (bottom right) neural information, respectively. B, Across-subject correlation between synergy in the two significant EEG channels (red represents CP3; blue represents C5) and choice accuracy in the VH condition.
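Partial information decomposition (PID) splits what two sources tell you about a target into unique, redundant, and synergistic parts. A minimal discrete sketch using the original Williams-Beer I_min redundancy measure; the paper applies PID to continuous neural representations with its own methodology, so this toy XOR example only makes "synergy" concrete: neither source alone is informative, yet together they fully determine the target.

```python
import numpy as np
from itertools import product

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mi(pxy):
    """I(X;Y) in bits from a joint distribution table p[x, y]."""
    px, py = pxy.sum(1), pxy.sum(0)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

def pid_williams_beer(p):
    """Williams-Beer PID of p[t, s1, s2]: returns (redundancy, unique1,
    unique2, synergy) in bits, using the I_min redundancy measure."""
    pt = p.sum((1, 2))
    p_s1, p_s2 = p.sum(2), p.sum(1)      # p[t, s1] and p[t, s2]
    red = 0.0
    for t in range(p.shape[0]):
        if pt[t] == 0:
            continue
        spec = []
        for ps in (p_s1, p_s2):
            cond = ps[t] / pt[t]         # p(s | T = t)
            marg = ps.sum(0)             # p(s)
            nz = cond > 0
            # Specific information I(T = t; S), a KL divergence
            spec.append(np.sum(cond[nz] * np.log2(cond[nz] / marg[nz])))
        red += pt[t] * min(spec)
    i1, i2 = mi(p_s1), mi(p_s2)
    i_joint = mi(p.reshape(p.shape[0], -1))
    return red, i1 - red, i2 - red, i_joint - i1 - i2 + red

# XOR target: purely synergistic information (1 bit, up to float error).
p_xor = np.zeros((2, 2, 2))
for s1, s2 in product([0, 1], repeat=2):
    p_xor[s1 ^ s2, s1, s2] = 0.25
red, u1, u2, syn = pid_williams_beer(p_xor)
```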
