Bayesian population decoding of spiking neurons
- PMID: 20011217
- PMCID: PMC2790948
- DOI: 10.3389/neuro.10.021.2009
Abstract
The timing of action potentials in spiking neurons depends on the temporal dynamics of their inputs and contains information about temporal fluctuations in the stimulus. Leaky integrate-and-fire neurons constitute a popular class of encoding models, in which spike times depend directly on the temporal structure of the inputs. However, optimal decoding rules for these models have only been studied explicitly in the noiseless case. Here, we study decoding rules for probabilistic inference of a continuous stimulus from the spike times of a population of leaky integrate-and-fire neurons with threshold noise. We derive three algorithms for approximating the posterior distribution over stimuli as a function of the observed spike trains. In addition to a reconstruction of the stimulus, we thus obtain an estimate of its uncertainty. Furthermore, we derive a 'spike-by-spike' online decoding scheme that recursively updates the posterior with the arrival of each new spike. We use these decoding rules to reconstruct time-varying stimuli represented by a Gaussian process from spike trains of single neurons as well as neural populations.
Keywords: Bayesian decoding; approximate inference; population coding; spiking neurons.
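To make the setting concrete, below is a minimal, self-contained Python (NumPy) sketch, not the paper's algorithm: a leaky integrate-and-fire neuron with threshold noise encodes a stimulus drawn from a Gaussian process, and a crude Monte Carlo scheme approximates the posterior over stimuli from the observed spike times. All parameter values, the squared-exponential kernel, and the Gaussian spike-time mismatch score used as a stand-in for the exact likelihood are illustrative assumptions.

```python
# Minimal sketch of the setting described in the abstract: an LIF neuron with
# threshold noise encodes a Gaussian-process stimulus, and a crude Monte Carlo
# scheme approximates the posterior over stimuli given the observed spikes.
# Parameter values, the kernel, and the spike-time mismatch score are
# illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# --- Gaussian-process stimulus prior (squared-exponential kernel) ---
dt, T = 1e-3, 1.0
t = np.arange(0.0, T, dt)
mu_s, sigma_s, ell = 1.0, 1.0, 0.1               # assumed GP mean / amplitude / length scale
K = sigma_s**2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(t)))

def sample_stimulus():
    """Draw one stimulus trajectory from the GP prior."""
    return mu_s + L @ rng.standard_normal(len(t))

# --- LIF encoder with threshold noise ---
tau, v_reset, theta0, sigma_theta = 0.02, 0.0, 1.0, 0.1   # assumed values

def lif_spikes(s, noisy=True):
    """Spike times of an LIF neuron driven by stimulus s; the threshold is
    redrawn after each spike when noisy=True (threshold noise)."""
    v, spikes = 0.0, []
    theta = theta0 + (sigma_theta * rng.standard_normal() if noisy else 0.0)
    for i, ti in enumerate(t):
        v += dt * (s[i] - v) / tau               # leaky integration of the input
        if v >= theta:                           # threshold crossing -> spike
            spikes.append(ti)
            v = v_reset
            theta = theta0 + (sigma_theta * rng.standard_normal() if noisy else 0.0)
    return np.array(spikes)

# --- Crude Monte Carlo approximation of the posterior over stimuli ---
s_true = sample_stimulus()
obs = lif_spikes(s_true)

def spike_mismatch(a, b):
    """Squared spike-time distance; the shorter train is padded with T."""
    n = max(len(a), len(b))
    a = np.pad(a, (0, n - len(a)), constant_values=T)
    b = np.pad(b, (0, n - len(b)), constant_values=T)
    return np.sum((a - b)**2)

candidates = np.array([sample_stimulus() for _ in range(500)])   # prior samples
log_w = np.array([-spike_mismatch(lif_spikes(c, noisy=False), obs) / (2 * 0.01**2)
                  for c in candidates])
w = np.exp(log_w - log_w.max())
w /= w.sum()

posterior_mean = w @ candidates                                # reconstruction
posterior_std = np.sqrt(w @ (candidates - posterior_mean)**2)  # uncertainty estimate

print(f"{len(obs)} observed spikes; correlation of posterior mean with the "
      f"true stimulus: {np.corrcoef(posterior_mean, s_true)[0, 1]:.2f}")
```

The paper itself derives analytic approximations to this posterior, including a recursive spike-by-spike update; the sampling scheme above only illustrates the overall structure of the decoding problem (posterior mean as reconstruction, posterior spread as uncertainty).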
Similar articles
- Reconstructing stimuli from the spike times of leaky integrate and fire neurons. Front Neurosci. 2011;5:1. doi: 10.3389/fnins.2011.00001. PMID: 21390287. Free PMC article.
- Prediction and decoding of retinal ganglion cell responses with a probabilistic spiking model. J Neurosci. 2005;25(47):11003-13. doi: 10.1523/JNEUROSCI.3305-05.2005. PMID: 16306413. Free PMC article.
- Neural decoding with visual attention using sequential Monte Carlo for leaky integrate-and-fire neurons. PLoS One. 2019;14(5):e0216322. doi: 10.1371/journal.pone.0216322. PMID: 31086375. Free PMC article.
- Statistical models for neural encoding, decoding, and optimal stimulus design. Prog Brain Res. 2007;165:493-507. doi: 10.1016/S0079-6123(06)65031-0. PMID: 17925266. Review.
- Reading spike timing without a clock: intrinsic decoding of spike trains. Philos Trans R Soc Lond B Biol Sci. 2014;369(1637):20120467. doi: 10.1098/rstb.2012.0467. PMID: 24446501. Free PMC article. Review.
Cited by
- Dissociated sequential activity and stimulus encoding in the dorsomedial striatum during spatial working memory. Elife. 2016;5:e19507. doi: 10.7554/eLife.19507. PMID: 27636864. Free PMC article.
- Efficient Markov chain Monte Carlo methods for decoding neural spike trains. Neural Comput. 2011;23(1):46-96. doi: 10.1162/NECO_a_00059. PMID: 20964539. Free PMC article.
- Sparse decoding of multiple spike trains for brain-machine interfaces. J Neural Eng. 2012;9(5):054001. doi: 10.1088/1741-2560/9/5/054001. PMID: 22954906. Free PMC article. Clinical Trial.
- Extraction of Network Topology From Multi-Electrode Recordings: Is there a Small-World Effect? Front Comput Neurosci. 2011;5:4. doi: 10.3389/fncom.2011.00004. PMID: 21344015. Free PMC article.
- An in silico model for determining the influence of neuronal co-activity on rodent spatial behavior. J Neurosci Methods. 2022;377:109627. doi: 10.1016/j.jneumeth.2022.109627. PMID: 35609789. Free PMC article.