Review. Theor Biol Med Model. 2005 Aug 11;2:29. doi: 10.1186/1742-4682-2-29.

Three subsets of sequence complexity and their relevance to biopolymeric information

David L Abel et al.

Abstract

Genetic algorithms instruct sophisticated biological organization. Three qualitative kinds of sequence complexity exist: random (RSC), ordered (OSC), and functional (FSC). FSC alone provides algorithmic instruction. Random and Ordered Sequence Complexities lie at opposite ends of the same bi-directional sequence complexity vector. Randomness in sequence space is defined by a lack of Kolmogorov algorithmic compressibility. A sequence is compressible because it contains redundant order and patterns. Law-like cause-and-effect determinism produces highly compressible order. Such forced ordering precludes both information retention and the freedom of selection so critical to algorithmic programming and control. Functional Sequence Complexity requires this added programming dimension of uncoerced selection at successive decision nodes in the string. Shannon information theory measures the relative degrees of RSC and OSC, but it cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC).
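
To make the abstract's distinction concrete, here is a minimal Python sketch (not part of the published paper; the example strings are arbitrary assumptions) that computes per-symbol Shannon uncertainty for an ordered, a random, and a codon-like stand-in for a functional sequence. It illustrates the point above: Shannon's measure separates order from randomness, but it scores the functional string about the same as the random one, so it cannot detect FSC.

    import math
    from collections import Counter

    def shannon_bits_per_symbol(seq):
        """Average Shannon uncertainty (bits per symbol) of a sequence."""
        counts = Counter(seq)
        n = len(seq)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    ordered = "AT" * 10                    # law-like repetition (OSC-like)
    random_ = "GTCAAGTCCTAGGCTATACG"       # patternless string (RSC-like)
    functional = "ATGGCCATTGTAATGGGCCG"    # codon-like stand-in for an FSC string

    for name, s in [("ordered", ordered), ("random", random_), ("functional", functional)]:
        print(f"{name:10s} {shannon_bits_per_symbol(s):.2f} bits/symbol")
    # The ordered string scores 1 bit/symbol; the random and functional strings both
    # score close to 2 bits/symbol, so the entropy value alone says nothing about function.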


Figures

Figure 1
The inverse relationship between order and complexity as demonstrated on a linear vector progression from high order toward greater complexity (modified from [93]).
Figure 2
Adding a second dimension to Figure 1 allows visualization of the relationship of Kolmogorov algorithmic compressibility to complexity. The more highly ordered (patterned) a sequence, the more highly compressible that sequence becomes. The less compressible a sequence, the more complex that sequence is. A random sequence manifests no Kolmogorov compressibility; this serves as the very definition of a random, highly complex string.
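
A rough way to see the caption's relationship numerically (an illustration, not from the paper) is to use a general-purpose compressor as a computable stand-in for the uncomputable Kolmogorov measure; the zlib compression level and the example sequences below are assumptions chosen for contrast.

    import random
    import zlib

    def compressed_fraction(s: str) -> float:
        """Compressed size divided by original size (smaller = more compressible)."""
        raw = s.encode("ascii")
        return len(zlib.compress(raw, 9)) / len(raw)

    ordered = "AT" * 500                                            # highly patterned sequence
    random.seed(0)
    rand_seq = "".join(random.choice("ACGT") for _ in range(1000))  # no exploitable pattern

    print("ordered:", round(compressed_fraction(ordered), 3))   # tiny fraction: highly compressible
    print("random :", round(compressed_fraction(rand_seq), 3))  # only reaches its ~2-bit/symbol entropy floor

The patterned string shrinks dramatically, while the pseudo-random one compresses only down to its per-symbol entropy, mirroring the gradient from order to complexity described above.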
Figure 3
Shannon's original 1948 communication diagram is here modified with an oval superimposed over the limits of Shannon's actual research. Shannon never left the confines of this oval to address the essence of meaningful communication. Any theory of Instruction would need to extend outside of the oval to quantify the ideal function and indirect "meaning" of any message.
Figure 4
Superimposition of Functional Sequence Complexity onto Figure 2. The Y1 axis plane plots the decreasing degree of algorithmic compressibility as complexity increases from order towards randomness. The Y2 (Z) axis plane shows where along the same complexity gradient (X-axis) highly instructional sequences are generally found. The Functional Sequence Complexity (FSC) curve includes all algorithmic sequences that work at all (W). The peak of this curve (w*) represents "what works best." The FSC curve is usually quite narrow and lies closer to the random end than to the ordered end of the complexity scale. Compression of an instructive sequence slides the FSC curve towards the right (away from order, towards maximum complexity, maximum Shannon uncertainty, and seeming randomness) with no loss of function.
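
The caption's closing point can be mimicked outside biology with a short sketch (assumed inputs, not the authors' procedure): losslessly compressing a repetitive instructive message drives its per-byte Shannon uncertainty toward that of a random string, yet the original instruction remains fully recoverable, so no function is lost.

    import math
    import zlib
    from collections import Counter

    def bits_per_byte(data: bytes) -> float:
        """Average Shannon uncertainty (bits per byte) of a byte string."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Hypothetical instructive message standing in for an algorithmic biosequence.
    instruction = b"open valve three a quarter turn, then wait ten seconds before sampling " * 20
    packed = zlib.compress(instruction, 9)

    print("plain     :", round(bits_per_byte(instruction), 2), "bits/byte")
    print("compressed:", round(bits_per_byte(packed), 2), "bits/byte")
    assert zlib.decompress(packed) == instruction  # the instruction survives intact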

References

    1. Shannon CE, Weaver W. The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press; 1949.
    2. Shannon C. A mathematical theory of communication, Parts I and II. The Bell System Technical Journal. 1948;XXVII:379–423.
    3. Shannon C. A mathematical theory of communication, Part III. The Bell System Technical Journal. 1948;XXVII:623–656.
    4. Kolmogorov AN. Three approaches to the definition of the concept "quantity of information". Problems Inform Transmission. 1965;1:1–7.
    5. Li M, Vitanyi P. An Introduction to Kolmogorov Complexity and Its Applications. 2nd ed. In: Gries D, Schneider FB, editors. Graduate Texts in Computer Science. New York: Springer-Verlag; 1997. p. 637.
