2012;2:514.
doi: 10.1038/srep00514. Epub 2012 Jul 19.

Information processing capacity of dynamical systems


Joni Dambre et al. Sci Rep. 2012.

Abstract

Many dynamical systems, both natural and artificial, are stimulated by time-dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This capacity is bounded by the number of linearly independent state variables of the dynamical system, and equals it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction-diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
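The capacity measure summarized above can be sketched numerically. The following is an illustrative sketch, not the authors' code: it drives a small echo state network with i.i.d. uniform input, and for each target function (here, delayed Legendre polynomials of the input; all parameter values are illustrative) scores the fraction of the target's variance reproduced by the best linear readout of the network state. Summing over an orthogonal family of targets estimates the total capacity, which the theory bounds by the number of linearly independent state variables N.

```python
# Illustrative sketch of the capacity measure (assumed setup, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 50, 10000, 200          # network size, run length, discarded steps
rho, iota = 0.95, 0.5                   # gain and input-scaling parameters

W = rng.normal(size=(N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius to rho
w_in = rng.uniform(-1.0, 1.0, size=N)

u = rng.uniform(-1.0, 1.0, size=T)      # i.i.d. input on [-1, 1]
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + iota * w_in * u[t])
X = x[washout:]                         # states after transients die out

def capacity(z):
    """Fraction of var(z) captured by the optimal linear readout of X."""
    w, *_ = np.linalg.lstsq(X, z, rcond=None)
    return 1.0 - np.mean((X @ w - z) ** 2) / np.mean(z ** 2)

idx = np.arange(washout, T)
delays = range(1, 30)
# Degree-1 targets u(t-d) probe linear memory; degree-3 Legendre targets
# P3(u) = (5u^3 - 3u)/2 probe one mode of nonlinear processing.
lin = sum(capacity(u[idx - d]) for d in delays)
cubic = sum(capacity(2.5 * u[idx - d] ** 3 - 1.5 * u[idx - d]) for d in delays)
print(f"linear memory ~ {lin:.1f}, degree-3 capacity ~ {cubic:.1f}, bound N = {N}")
```

Each individual capacity lies in [0, 1]. The full construction in the paper also includes product targets spanning several delays; this sketch covers only single-delay polynomials, so the two sums shown remain well below the bound N.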


Figures

Figure 1
Figure 1. Total measured capacity CTOT for the logistic map (left) and an ESN with 50 nodes (right) as a function of the gain parameter ρ and for three different values of the input scaling parameter ι.
The edge of stability of the undriven system is indicated in both figures by the dotted line.
Figure 2
Figure 2. Breakdown of total measured capacity according to the degree of the basis function as a function of the parameter ι for ESN and logistic map, and Ts for RD system.
The values of ρ (0.95 for ESN and 2.5 for logistic map) were chosen close to the edge of stability, a region in which the most useful processing often occurs. The scale bar indicates the degree of the polynomial. Capacities for polynomials up to degree 9 were measured, but the higher-degree contributions are too small to appear in the plots. Note how, when the parameters ι and Ts increase, the systems become increasingly nonlinear. Because the hyperbolic tangent is an odd function and the input is unbiased, the capacities for the ESN essentially vanish for even degrees.
Figure 3
Figure 3. Trade-off between linear memory L[C] (red, dashed, left axis) and nonlinearity NL[C] (blue, full line, right axis), for ESN (ρ = 0.95), logistic map (ρ = 2.5) and Reaction Diffusion systems as the parameters ι and Ts are changed.
Figure 4
Figure 4. Decrease of total capacity CTOT due to noise, for an ESN with ρ = 0.95, for different values of ι, corresponding to varying degrees of nonlinearity.
In the left panel there are two i.i.d. inputs, the signal u and the noise v. The horizontal axis is the input signal-to-noise ratio in dB (10 log₁₀ of the ratio of signal power to noise power). The fraction of the total capacity which is usable increases when the SNR increases, and decreases when the system becomes more non-linear (increasing ι). In the right panel there are a varying number K of i.i.d. inputs with equal power, the total power being kept constant as K varies. The capacity for computing functions of a single input in the presence of multiple inputs is plotted. The black line indicates the situation for a strictly linear system, where the capacity for a single input should equal N/K.

References

    1. Arbib M. (Ed.) The handbook of brain theory and neural networks, second edition. The MIT Press, Cambridge MA (2003).
    2. Maass W., Natschläger T. & Markram H. Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comp. 14, 2531–2560 (2002). - PubMed
    3. Jaeger H. & Haas H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 304, 78–80 (2004). - PubMed
    4. Verstraeten D., Schrauwen B., D'Haene M. & Stroobandt D. An experimental unification of reservoir computing methods. Neural Networks 20, 391–403 (2007). - PubMed
    5. Vandoorne K. et al. Toward optical signal processing using Photonic Reservoir Computing. Optics Express 16(15), 11182–11192 (2008). - PubMed
