Brain Topogr. 2009 May;21(3-4):157-67.
doi: 10.1007/s10548-009-0088-3. Epub 2009 Mar 27.

Not just for bimodal neurons anymore: the contribution of unimodal neurons to cortical multisensory processing

Brian L Allman et al. Brain Topogr. 2009 May.

Abstract

Traditionally, neuronal studies of multisensory processing proceeded by first identifying neurons that were overtly multisensory (e.g., bimodal, trimodal) and then testing them. In contrast, the present study examined, without precondition, neurons in an extrastriate visual area of the cat for their responses to separate (visual, auditory) and combined-modality (visual and auditory) stimulation. As expected, traditional bimodal forms of multisensory neurons were identified. In addition, however, many neurons that were activated only by visual stimulation (i.e., unimodal) had that response modulated by the presence of an auditory stimulus. Some unimodal neurons showed multisensory responses that were statistically different from their visual response. Other unimodal neurons had subtle multisensory effects that were detectable only at the population level. Most surprisingly, these non-bimodal neurons generated more than twice as much multisensory signal in the posterolateral lateral suprasylvian area (PLLS) as the bimodal neurons did. These results expand the range of multisensory convergence patterns beyond that of the bimodal neuron. However, rather than constituting a separate class of multisensory neurons, unimodal multisensory neurons may actually represent an intermediary form of multisensory convergence that exists along the functional continuum between unisensory neurons, at one end, and fully bimodal neurons at the other.
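The classification scheme the abstract describes (bimodal vs. subthreshold multisensory vs. unisensory) can be sketched as a simple decision rule. The function below is a hypothetical illustration of that logic only; the boolean inputs stand in for the statistical tests the authors ran, and none of this is their actual analysis code.

```python
# Hypothetical sketch of the neuron classification logic described in the
# abstract. The three booleans stand in for the outcomes of statistical
# tests on trial-by-trial responses; the labels are illustrative.

def classify_neuron(v_driven: bool, a_driven: bool, va_differs_from_v: bool) -> str:
    """Classify a neuron from three test outcomes:
    v_driven          - visual stimulus alone evokes a suprathreshold response
    a_driven          - auditory stimulus alone evokes a suprathreshold response
    va_differs_from_v - combined (VA) response differs significantly from V alone
    """
    if v_driven and a_driven:
        # Overtly multisensory: responds to either modality alone
        return "bimodal"
    if v_driven and va_differs_from_v:
        # Auditory input is subthreshold alone but modulates the visual response
        return "subthreshold multisensory"
    if v_driven:
        return "unisensory (visual)"
    return "unresponsive/other"

print(classify_neuron(True, True, True))    # bimodal
print(classify_neuron(True, False, True))   # subthreshold multisensory
print(classify_neuron(True, False, False))  # unisensory (visual)
```

As the abstract notes, even neurons classified here as "unisensory (visual)" can show multisensory facilitation at the population level, so the single-neuron rule above understates the true extent of multisensory convergence.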


Figures

Fig. 1
A hypothetical schematic of the sensory inputs that produce bimodal and unimodal sensory responses. The bimodal neuron (grey) receives inputs (black) at differing locations and in differing numbers (or weights) from sensory modalities ‘A’ and ‘B’. These inputs produce suprathreshold activity in either modality when stimulated alone. In addition, when inputs from ‘A + B’ are combined, responses can be integrated to the extent that they are significantly different from responses to either input alone. In contrast, the unimodal neuron receives inputs from only one modality (‘A’), and is unaffected by the presence of stimuli in modality ‘B’, whether alone or in combination with ‘A’
Fig. 2
Location of the PLLS and representative sensory/multisensory responses. In (a), the lateral view of the cat cerebral cortex shows the suprasylvian sulcus opened (grey) and the location of the PLLS (arrows). Some auditory cortical fields are also depicted. The vertical lines indicate the levels from which the lower, coronal sections are derived. On the coronal sections, the grey shaded area indicates the location of the PLLS; each dot represents the location of the neurons whose responses are depicted in parts (b) and (c). In (b), this bimodal PLLS neuron responded (raster = 25 trials; histogram = 10 ms time bins) to the presentation of a visual stimulus (ramp labeled ‘V’) as well as to an auditory stimulus (square wave labeled ‘A’). In the third panel, when the visual and auditory stimuli were presented together (VA), an even more vigorous response was recorded. The responses to the different stimulus conditions are summarized in the bar graph (far right; mean spikes/trial and standard deviation); dashed line = spontaneous activity level; ‘*’ = statistically significant (P < 0.05, paired t-test). Part (c) illustrates the responses of a unimodal PLLS neuron: it was strongly activated by a visual stimulus, but an auditory stimulus had no effect either alone or in combination with the visual cue
Fig. 3
Bimodal as well as unimodal PLLS neurons show multisensory effects. In (a) and (d), the graphs plot the visual response (V) versus the multisensory response change (visual response subtracted from the combined visual-auditory response) for bimodal and unimodal PLLS neurons, respectively. For bimodal neurons (n = 49), combined (VA) responses were greater than visual (V) responses in a large proportion of neurons, and multisensory responses tended to increase with increasing visual responsiveness, as shown in (a). These data are re-plotted in (b) to show that the proportion of response change (%) was overwhelmingly in the positive direction. The bar graph (c) shows that the average response of bimodal neurons increased significantly (P < 0.05, paired t-test) from 7.0 ± 1.0 to 8.8 ± 1.3 mean spikes/trial when an auditory stimulus was also present. Surprisingly, these same trends were observed for the larger population of unimodal PLLS neurons (n = 233), which were activated robustly by visual stimulation but could not be activated by an auditory stimulus presented alone. In fact, the bar graph (f) indicates that the average response of unimodal neurons increased significantly (P < 0.05, paired t-test) from 6.0 ± 0.4 to 6.9 ± 0.7 mean spikes/trial when visual was combined with auditory stimulation
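The two quantities this figure reports can be computed directly from the definitions in the caption: the response change is the combined (VA) response minus the visual (V) response, expressed as a percentage of V, and the significance test is a paired t-test on per-trial spike counts. The sketch below is a minimal, hedged reconstruction of that arithmetic (it is not the authors' code, and the example trial data are invented); it returns the t statistic only, since a p-value would need a t-distribution CDF (e.g., `scipy.stats.ttest_rel` in practice).

```python
import statistics

def response_change_pct(v_mean: float, va_mean: float) -> float:
    """Multisensory response change relative to the visual response:
    100 * (VA - V) / V, as plotted in panels (b) and (e)."""
    return 100.0 * (va_mean - v_mean) / v_mean

def paired_t(v_trials, va_trials):
    """Paired t statistic on per-trial spike counts for the V vs. VA
    conditions (compare against a t table, or use scipy.stats.ttest_rel,
    to get the P < 0.05 criterion used in the figure)."""
    diffs = [va - v for v, va in zip(v_trials, va_trials)]
    n = len(diffs)
    mean_d = statistics.fmean(diffs)
    sd_d = statistics.stdev(diffs)          # sample SD of the differences
    return mean_d / (sd_d / n ** 0.5)

# Using the bimodal population means from panel (c): 7.0 -> 8.8 spikes/trial
print(round(response_change_pct(7.0, 8.8), 1))  # 25.7
```

Note that the population-mean change above is only a summary; the paired test in the paper operates on each neuron's trial-by-trial responses, which is what lets even small per-neuron effects reach significance at the population level.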
Fig. 4
Unimodal PLLS neurons show multisensory effects. The subset of unimodal neurons (37/233) that showed a statistically significant response modulation at the single-neuron level during the combined visual-auditory condition (i.e., subthreshold multisensory neurons) all demonstrated an increased multisensory response change (a), with an average response change of 42 ± 3% (b). The bar graph (c) summarizes this effect: the average response of the group of subthreshold multisensory neurons was significantly greater for the combined visual-auditory condition (VA; 11.4 ± 1.4 spikes/trial) than for the visual condition (V; 8.7 ± 1.2 spikes/trial; P < 0.05, paired t-test). The vast majority (81%, 159/196) of seemingly ‘unisensory’ neurons in the PLLS, despite failing to achieve the statistical criterion of multisensory processing at the neuronal level, demonstrated modest group response facilitation during the VA condition (hence plotted above zero on the y-axis in d). For these unimodal neurons, the average response change was 11 ± 1% (e). Consequently, as seen in the bar graph (f), the population average mean spikes/trial for these neurons was significantly greater for the VA condition than the V condition (V: 5.5 ± 0.4 vs. VA: 6.0 ± 0.4; P < 0.05, paired t-test)
Fig. 5
Hypothetical schematics of the patterns of sensory inputs that generate different forms of sensory/multisensory processing. The figures at the top represent the current model of multisensory processing. For a given neuron (grey colored body and dendrite), afferent axons and terminals (black) from one sensory modality (e.g., ‘A’) or another (‘B’) provide excitatory drive via the priority (location) or weight (number) of their inputs. In the case of unimodal neurons (top right), inputs from modality ‘A’ generate suprathreshold responses to that modality, while modality ‘B’ has no effect either alone or in combination with ‘A.’ In contrast, for bimodal neurons (top left), inputs from either modality ‘A’ or ‘B’ are sufficient to activate the neuron alone, and their combination produces an integration of the responses to ‘A’ and ‘B.’ A combined ‘A + B’ stimulus thus generates an integrated multisensory signal in bimodal neurons, but a distinct unisensory response from unimodal neurons. The results of the present investigation suggest that there is a range of convergence patterns between the bimodal and unisensory extremes. By reducing the priority and/or weighting of inputs from modality ‘B,’ a neuron may lose its suprathreshold response to that modality while still being significantly affected by it when it is combined with inputs from modality ‘A.’ These forms of multisensory neurons have been termed ‘subthreshold’ (e.g., Dehner et al. 2004; Allman and Meredith 2007; Meredith and Allman 2009). Further reduction of the input priority/weighting of modality ‘B’ further reduces its effectiveness in influencing responses to modality ‘A,’ such that its effect can be seen only at the population level or when local inhibitory circuits are pharmacologically blocked (Allman et al. 2008a).
In this way, the combination of modalities ‘A’ and ‘B’ produces responses within a mixed population of neurons whose activity levels generate a smoothed, or continuous, distribution between the two extremes. The patterns of multisensory convergence are not likely to be limited to these schematics, nor are these figures intended to represent any specific anatomical relationships

References

    1. Allman BL, Meredith MA. Multisensory processing in ‘unimodal’ neurons: cross-modal subthreshold auditory effects in cat extrastriate visual cortex. J Neurophysiol. 2007;98:545–549.
    2. Allman BL, Bittencourt-Navarrete RE, Keniston LP, Medina AE, Wang MY, Meredith MA. Do cross-modal projections always result in multisensory integration? Cereb Cortex. 2008a;18:2066–2076.
    3. Allman BL, Keniston LP, Meredith MA. Subthreshold auditory inputs to extrastriate visual neurons are responsive to parametric changes in stimulus quality: sensory-specific versus non-specific coding. Brain Res. 2008b;1242:95–101.
    4. Barraclough NE, Xiao D, Baker CI, Oram MW, Perrett DI. Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. J Cognitive Neurosci. 2005;17:377–391.
    5. Bizley JK, Nodal FR, Bajo VM, Nelken I, King AJ. Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cereb Cortex. 2007;17:2172–2189.