Neural Comput. 2003 Aug;15(8):1843-64. doi: 10.1162/08997660360675062.

Neural representation of probabilistic information


M J Barber et al. Neural Comput. 2003 Aug.

Abstract

It has been proposed that populations of neurons process information in terms of probability density functions (PDFs) of analog variables. Such analog variables range, for example, from target luminance and depth on the sensory interface to eye position and joint angles on the motor output side. The requirement that analog variables must be processed leads inevitably to a probabilistic description, while the limited precision and lifetime of the neuronal processing units lead naturally to a population representation of information. We show how a time-dependent probability density ρ(x; t) over variable x, residing in a specified function space of dimension D, may be decoded from the neuronal activities in a population as a linear combination of certain decoding functions φ_i(x), with coefficients given by the N firing rates a_i(t) (generally with D << N). We show how the neuronal encoding process may be described by projecting a set of complementary encoding functions φ̂_i(x) on the probability density ρ(x; t), and passing the result through a rectifying nonlinear activation function. We show how both encoders φ̂_i(x) and decoders φ_i(x) may be determined by minimizing cost functions that quantify the inaccuracy of the representation. Expressing a given computation in terms of manipulation and transformation of probabilities, we show how this representation leads to a neural circuit that can carry out the required computation within a consistent Bayesian framework, with the synaptic weights being explicitly generated in terms of encoders, decoders, conditional probabilities, and priors.
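The encode/decode cycle the abstract describes can be sketched numerically: project encoding functions onto a density, rectify to obtain nonnegative firing rates, then reconstruct the density as a linear combination of decoding functions. The following NumPy sketch is illustrative only; the Gaussian bump encoders, the discretization grid, and the pseudoinverse-based decoders are assumptions for demonstration, not the cost-function-derived encoders and decoders of the paper.

```python
import numpy as np

# Grid over the analog variable x (assumed domain [0, 1]).
x = np.linspace(0.0, 1.0, 200)
dx = x[1] - x[0]

# N neurons with Gaussian-bump encoding functions phi_hat_i(x)
# (an assumed form, not the paper's derived encoders).
N = 20
centers = np.linspace(0.0, 1.0, N)
phi_hat = np.exp(-(x[None, :] - centers[:, None])**2 / (2 * 0.05**2))  # N x grid

# A probability density rho(x) to represent, normalized on the grid.
rho = np.exp(-(x - 0.4)**2 / (2 * 0.1**2))
rho /= rho.sum() * dx

# Encode: project encoders onto rho, then pass through a rectifying
# nonlinearity so the firing rates a_i are nonnegative.
M = phi_hat * dx                 # discretized projection operator
a = np.maximum(0.0, M @ rho)     # firing rates a_i

# Decode: here the decoding functions phi_i(x) are taken as the columns
# of the pseudoinverse of M (a simple least-squares choice standing in
# for the paper's cost-minimizing decoders).
phi = np.linalg.pinv(M)          # grid x N
rho_hat = phi @ a                # reconstructed density

err = np.linalg.norm(rho_hat - rho) / np.linalg.norm(rho)
```

With smooth densities well inside the domain, the rectification is inactive and the least-squares decoder recovers the density nearly exactly, illustrating the D << N redundancy of the population code.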
