Computational principles and models of multisensory integration
- PMID: 27918886
- PMCID: PMC5447489
- DOI: 10.1016/j.conb.2016.11.002
Abstract
Combining information from multiple senses creates robust percepts, speeds up responses, enhances learning, and improves detection, discrimination, and recognition. In this review, I discuss computational models and principles that provide insight into how this process of multisensory integration occurs at the behavioral and neural level. My initial focus is on drift-diffusion and Bayesian models that can predict behavior in multisensory contexts. I then highlight how recent neurophysiological and perturbation experiments provide evidence for a distributed redundant network for multisensory integration. I also emphasize studies which show that task-relevant variables in multisensory contexts are distributed in heterogeneous neural populations. Finally, I describe dimensionality reduction methods and recurrent neural network models that may help decipher heterogeneous neural populations involved in multisensory integration.
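The Bayesian models mentioned above typically formalize cue combination as reliability-weighted averaging: each cue's estimate is weighted by its inverse variance, and the fused estimate is more precise than either cue alone. A minimal sketch, assuming independent Gaussian noise on two cues (the function name and the visual/auditory example values are illustrative, not from the article):

```python
import math

def combine_cues(mu_v, sigma_v, mu_a, sigma_a):
    """Reliability-weighted (maximum-likelihood) fusion of two Gaussian cues.

    Each cue is weighted by its inverse variance (its reliability);
    the fused estimate has a smaller standard deviation than either
    cue on its own.
    """
    w_v = 1.0 / sigma_v ** 2   # reliability of cue 1 (e.g., visual)
    w_a = 1.0 / sigma_a ** 2   # reliability of cue 2 (e.g., auditory)
    mu = (w_v * mu_v + w_a * mu_a) / (w_v + w_a)
    sigma = math.sqrt(1.0 / (w_v + w_a))
    return mu, sigma

# Hypothetical example: visual cue at 10 deg (sd 2 deg),
# auditory cue at 14 deg (sd 4 deg). The fused estimate is
# pulled toward the more reliable visual cue.
mu, sigma = combine_cues(10.0, 2.0, 14.0, 4.0)
```

In this example the fused location lands much closer to the visual estimate (10.8 deg) and its standard deviation (about 1.79 deg) is below that of either single cue, which is the signature behavioral prediction these models are tested against.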
Copyright © 2016 Elsevier Ltd. All rights reserved.
Conflict of interest statement
Nothing declared.