Python for information theoretic analysis of neural data

Robin A A Ince et al. Front Neuroinform. 2009 Feb 11;3:4.
doi: 10.3389/neuro.11.004.2009. eCollection 2009.
Abstract

Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.

Keywords: Python; bias; e-science; entropy; information theory; maximum entropy; neural coding.
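The limited-sampling bias mentioned in the abstract is easy to demonstrate. Below is a minimal sketch of a plug-in ("naive") estimator of I(S; R) for discrete data in plain NumPy; it is illustrative only, not the authors' library code, and all function and variable names here are invented for the example. With few trials, the plug-in estimate is systematically biased upward: it reports positive information even when responses are independent of the stimulus.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum likelihood) entropy estimate in bits."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def plugin_mi(stimuli, responses):
    """Plug-in estimate of I(S;R) = H(R) - H(R|S) for discrete data."""
    s_vals = np.unique(stimuli)
    r_vals = np.unique(responses)
    # H(R): entropy of the pooled response distribution
    h_r = plugin_entropy(np.array([np.sum(responses == r) for r in r_vals]))
    # H(R|S): response entropy conditioned on each stimulus, weighted by P(s)
    h_r_given_s = 0.0
    for s in s_vals:
        r_s = responses[stimuli == s]
        h_r_given_s += (len(r_s) / len(responses)) * plugin_entropy(
            np.array([np.sum(r_s == r) for r in r_vals]))
    return h_r - h_r_given_s

rng = np.random.default_rng(0)
stim = np.repeat(np.arange(4), 10)          # 4 stimuli, 10 trials each
resp = rng.integers(0, 8, size=stim.size)   # responses independent of stim
mi_biased = plugin_mi(stim, resp)           # positive despite true I(S;R) = 0
```

Because the empirical joint distribution of a finite sample almost never factorises exactly, `mi_biased` comes out positive even though the true information is zero; correcting for this upward bias is the problem the bias-correction methods compared in Figure 1 address.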


Figures

Figure 1
Comparison of the performance of different bias correction methods. The methods were applied to spike trains of eight simulated somatosensory cortical neurons (see text). The information estimates I(S; R) and Ish(S; R) are plotted as a function of the available number of trials per stimulus. (A) Mean ± SD/2 (over 50 simulations) of I(S; R). (B) Mean ± SD/2 (over 50 simulations) of Ish(S; R). This calculation is very similar to that in Panzeri et al. (their Figure 3), which also used realistic simulations of cortical spike trains; the only difference was that for this figure, the simulated population did not contain any correlations. This figure was produced using the Python library for bias corrections described in Section “A Python Library for Information Theoretic Estimates”, and the code to produce it is available at http://code.google.com/p/pyentropy/.
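To illustrate the kind of correction the methods compared in Figure 1 perform, here is a sketch of one classical first-order correction, the Miller-Madow estimator. This is not the pyentropy implementation, just a minimal hand-written example; the helper name and the sampling setup are invented for illustration.

```python
import numpy as np

def miller_madow_entropy(counts):
    """Plug-in entropy plus the Miller-Madow first-order bias correction.

    The plug-in estimator underestimates entropy by roughly
    (m - 1) / (2 N ln 2) bits, where m is the number of occupied
    response bins and N the number of trials; adding this term back
    reduces (but does not eliminate) the limited-sampling bias.
    """
    n = counts.sum()
    p = counts / n
    p = p[p > 0]
    h_plugin = -np.sum(p * np.log2(p))
    m = np.count_nonzero(counts)
    return h_plugin + (m - 1) / (2 * n * np.log(2))

# Undersampled uniform distribution: true entropy is log2(32) = 5 bits,
# but with only 64 trials the plug-in estimate falls short.
rng = np.random.default_rng(1)
sample = rng.integers(0, 32, size=64)
counts = np.bincount(sample, minlength=32)
p = counts[counts > 0] / 64
h_plugin = -np.sum(p * np.log2(p))
h_mm = miller_madow_entropy(counts)   # correction pushes the estimate up
```

The corrected estimate `h_mm` is always larger than `h_plugin` and, on average, closer to the true 5 bits; the more sophisticated corrections compared in the figure (and the shuffled estimator Ish) target the residual bias that such a first-order term leaves behind.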
Figure 2
Responses of a VPm neuron to white noise vibrissa stimulation. (A) Vibrissa position as a function of time in units of stimulus SD (1 SD = 70 μm). (B) Spikes fired by the neuron in response to 70 repetitions of the stimulus shown in (A).
Figure 3
Response entropy of a VPm neuron to white noise vibrissa stimulation. The full response entropy [H(R), denoted H in the figure] is shown together with that of maximum entropy models preserving first [H(1)], first and second [H(2)], and up to third order [H(3)] marginal densities. The response is treated as non-overlapping words of 6 (panel A), 10 (panel B) and 14 (panel C) bins, with each bin of 4 ms duration.
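The first-order model H(1) in Figure 3 has a particularly simple form: the maximum entropy distribution constrained only by each bin's firing probability is the independent model, whose entropy is the sum of the per-bin marginal entropies and is therefore an upper bound on the full word entropy H. The following NumPy sketch illustrates this bound on synthetic binary words; it is not the authors' code, and the data-generation step (a duplicated bin, to force correlations) is invented for the example.

```python
import numpy as np

def word_entropy(words):
    """Plug-in entropy (bits) of binary response words, one word per row."""
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def first_order_entropy(words):
    """Entropy of the first-order maximum entropy model: the independent
    model preserving each bin's firing probability. For independent bins
    the word entropy factorises into a sum of marginal bin entropies."""
    h = 0.0
    for q in words.mean(axis=0):             # P(spike) in each bin
        if 0 < q < 1:
            h -= q * np.log2(q) + (1 - q) * np.log2(1 - q)
    return h

rng = np.random.default_rng(2)
# 200 words of 6 bins; bin 1 copies bin 0, so the bins are correlated
b0 = rng.integers(0, 2, size=(200, 1))
words = np.hstack([b0, b0, rng.integers(0, 2, size=(200, 4))])
h_full = word_entropy(words)       # true (plug-in) word entropy H
h1 = first_order_entropy(words)    # independent-model entropy H(1) >= H
```

Because the independent model ignores the correlation between the duplicated bins, `h1` exceeds `h_full` by roughly the entropy of the copied bin; the gaps H(1) − H, H(2) − H, etc. in Figure 3 quantify in the same way how much of the response structure is captured by each order of interaction.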

