The application of information theory to biochemical signaling systems

Alex Rhee et al. Phys Biol. 2012 Aug;9(4):045011.
doi: 10.1088/1478-3975/9/4/045011. Epub 2012 Aug 7.

Abstract

Cell signaling can be thought of fundamentally as an information transmission problem in which chemical messengers relay information about the external environment to the decision centers within a cell. Due to the biochemical nature of cellular signal transduction networks, molecular noise will inevitably limit the fidelity of any messages received and processed by a cell's signal transduction networks, leaving it with an imperfect impression of its environment. Fortunately, Shannon's information theory provides a mathematical framework independent of network complexity that can quantify the amount of information that can be transmitted despite biochemical noise. In particular, the channel capacity can be used to measure the maximum number of stimuli a cell can distinguish based upon the noisy responses of its signaling systems. Here, we provide a primer for quantitative biologists that covers fundamental concepts of information theory, highlights several key considerations when experimentally measuring channel capacity, and describes successful examples of the application of information theoretic analysis to biological signaling.
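
The channel capacity referred to above is the maximum, over all input distributions, of the mutual information between stimulus and response. As a rough illustration of how it can be computed from a discretized stimulus-response table, the Python sketch below implements the standard Blahut-Arimoto iteration; the conditional-probability matrix W is an invented toy example, not data from the paper.

import numpy as np

def _kl_rows(W, q):
    # Per-row KL divergence D(W_i || q) in bits, with the 0*log(0) = 0 convention.
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(W > 0, np.log2(W / q), 0.0)
    return np.sum(W * log_ratio, axis=1)

def blahut_arimoto(p_r_given_s, tol=1e-9, max_iter=10000):
    # p_r_given_s[i, j] = P(response bin j | stimulus i).
    # Returns the capacity in bits and the capacity-achieving input distribution.
    n_s = p_r_given_s.shape[0]
    p_s = np.full(n_s, 1.0 / n_s)          # start from a uniform stimulus distribution
    for _ in range(max_iter):
        p_r = p_s @ p_r_given_s            # marginal response distribution
        d = _kl_rows(p_r_given_s, p_r)     # information gained from each stimulus
        p_new = p_s * np.exp2(d)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p_s)) < tol:
            p_s = p_new
            break
        p_s = p_new
    p_r = p_s @ p_r_given_s
    return float(np.sum(p_s * _kl_rows(p_r_given_s, p_r))), p_s

# Hypothetical noisy channel: two stimulus levels, three coarse response bins.
W = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
capacity, p_opt = blahut_arimoto(W)
print(f"capacity ~ {capacity:.2f} bits")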


Figures

Figure 1
Figure 1. (A) Noise can limit the amount of information a cell can obtain about a stimulus
The magnitude of the noise is evident in the breadth of the probability distribution of the response to a given stimulus. When the noise is sufficiently large, a cell that can encounter either a strong or a weak stimulus cannot use its response to determine with certainty which stimulus it encountered. Consequently, from the cell’s perspective, noise leads to a loss of information about the input. The mutual information between the stimulus and the cellular response suffers accordingly: the greater the overlap between the response distributions, the less information is communicated. (B) Entropy can be understood as a measure of dispersion. A wider probability distribution corresponds to greater uncertainty in the cellular response and, consequently, higher entropy.
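
To connect the caption's verbal description to the underlying quantities, the short Python sketch below estimates entropy and mutual information directly from a joint stimulus-response table; the two-stimulus, four-bin table is an invented toy example rather than data from the paper.

import numpy as np

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_joint):
    # I(S; R) = H(S) + H(R) - H(S, R) for a joint probability table p_joint[s, r].
    return (entropy(p_joint.sum(axis=1)) + entropy(p_joint.sum(axis=0))
            - entropy(p_joint.ravel()))

# Toy example: two equally likely stimuli, responses binned into four levels.
# The overlap between the two rows is what limits the mutual information.
p_joint = 0.5 * np.array([[0.60, 0.30, 0.08, 0.02],   # weak stimulus
                          [0.05, 0.15, 0.30, 0.50]])  # strong stimulus
print(f"I(S;R) ~ {mutual_information(p_joint):.2f} bits")
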
Figure 2
Figure 2. (A) Schematic of a communication channel
A basic communication channel can be described by an input random variable S connected through a channel to an output random variable R, such that the outcome of R depends on S subject to the distorting influence of noise. In information theory, the channel, however complex, can be treated as a “black box”, since its internal details are fully captured by the joint distribution of R and S. (B) Entropy of a Bernoulli random variable as a function of its success probability p. This concave-down curve illustrates that entropy is at its maximum when both outcomes are equally probable (p = 0.5) and at its minimum when the outcome is predetermined (p = 0 or 1).
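
The curve in panel (B) is the binary entropy function H(p) = -p log2(p) - (1 - p) log2(1 - p). The brief Python sketch below evaluates it on a grid of p values (chosen only for illustration) and shows the maximum of 1 bit at p = 0.5.

import numpy as np

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1 - p)*log2(1 - p), with H(0) = H(1) = 0.
    p = np.asarray(p, dtype=float)
    h = np.zeros_like(p)
    for q in (p, 1.0 - p):
        mask = q > 0
        h[mask] -= q[mask] * np.log2(q[mask])
    return h

ps = np.linspace(0.0, 1.0, 11)
for p, h in zip(ps, binary_entropy(ps)):
    print(f"p = {p:.1f}  H(p) = {h:.3f} bits")   # peaks at 1 bit when p = 0.5
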
Figure 3
Figure 3. (A) Quantifying a neural spike train as a scalar or vector
Neural activity consists of intermittent spikes known as action potentials, and a series of spikes is known as a neural spike train. A spike train can be quantified as the total number of spikes over a given time period, yielding a scalar output. Alternatively, time can be divided into intervals small enough that the number of spikes in each interval is 0 or 1, enabling the spike train to be quantified as a binary vector output. As the total time frame is made longer, the vector becomes longer and it becomes increasingly difficult to sample the entire vector space adequately. (B) Bicoid and hunchback gradients in the Drosophila melanogaster embryo. In the developing embryo of Drosophila melanogaster, pre-deposited maternal bicoid mRNA is translated into a bicoid protein gradient along the anterior-posterior axis. Because bicoid is a cooperative transcriptional activator of hunchback, the smooth bicoid gradient drives expression of hunchback in a much sharper concentration gradient that delineates the anterior and posterior halves of the embryo. (C) Schematic of the TNF signaling network. Individually, the TNF-α-ATF-2 and TNF-α-NF-κB pathways each have a capacity of only ~0.9 bits of information. Combined, the network of pathways has only a marginally increased capacity of ~1.05 bits. Further investigation found that capacity was limited at the receptor level to ~1.25 bits, implying that the maximum capacity of the TNF network is ~1.25 bits regardless of the number of pathways or branch fidelity.
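
As a concrete illustration of the scalar versus vector representations in panel (A), the Python sketch below bins a hypothetical list of spike times into small intervals; the spike times, window length, and bin width are made up for the example.

import numpy as np

# Hypothetical spike times (in ms) within a 100 ms observation window.
spike_times = np.array([3.2, 11.7, 22.9, 40.5, 61.1, 77.8, 92.3])
window_ms, bin_ms = 100.0, 5.0

# Scalar representation: total spike count over the window.
spike_count = len(spike_times)

# Vector representation: 1 if at least one spike falls in a bin, else 0
# (here the bins are small enough that each count is 0 or 1).
edges = np.arange(0.0, window_ms + bin_ms, bin_ms)
counts, _ = np.histogram(spike_times, bins=edges)
binary_word = (counts > 0).astype(int)

print("scalar:", spike_count)
print("vector:", binary_word)
# With 20 bins there are 2**20 possible binary words, which is why longer windows
# (longer vectors) quickly become hard to sample adequately.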


