Resting brains never rest: computational insights into potential cognitive architectures
- PMID: 23561718
- DOI: 10.1016/j.tins.2013.03.001
Erratum in
- Trends Neurosci. 2018 Mar;41(3):161. doi: 10.1016/j.tins.2017.12.007. Epub 2018 Jan 6. PMID: 29317106. No abstract available.
Abstract
Resting-state networks (RSNs), which have become a main focus in neuroimaging research, are best simulated by large-scale cortical models in which networks teeter on the edge of instability. In this regime, the functional networks occupy a stable, low-firing state while being continuously pulled towards multiple other configurations. Small extrinsic perturbations can shape task-related network dynamics, whereas perturbations from intrinsic noise generate excursions reflecting the range of available functional networks. This is particularly advantageous for the efficiency and speed of network mobilization. Thus, the resting state reflects the dynamical capabilities of the brain, which emphasizes the vital interplay of time and space. In this article, we propose a new theoretical framework for RSNs that can serve as a fertile ground for empirical testing.
Copyright © 2013 Elsevier Ltd. All rights reserved.
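The mechanism sketched in the abstract, networks poised at the edge of instability whose intrinsic noise drives excursions through a repertoire of configurations, can be illustrated with a toy simulation. The sketch below is an assumption-laden stand-in, not the authors' model: it uses a small network of noisy Stuart-Landau (Hopf normal form) nodes with arbitrary parameters and a random coupling matrix in place of anatomical connectivity, merely to show how noise around a stable low-activity fixed point produces correlated fluctuations resembling resting-state functional connectivity.

```python
# Minimal illustrative sketch (not the authors' model): a few noisy Hopf
# normal-form nodes poised just below the oscillatory instability (a < 0,
# |a| small). Without noise each node sits in a stable low-activity fixed
# point; noise drives transient excursions whose pattern is shaped by the
# coupling matrix C. All parameter values are arbitrary demonstration choices.

import numpy as np

rng = np.random.default_rng(0)

n_nodes = 8               # hypothetical number of cortical regions
a = -0.05                 # bifurcation parameter: slightly negative = edge of instability
omega = 2 * np.pi * 0.1   # intrinsic frequency (arbitrary)
g = 0.3                   # global coupling strength (arbitrary)
sigma = 0.02              # noise amplitude
dt = 0.1                  # integration step
n_steps = 20000

# Random symmetric coupling as a stand-in for anatomical connectivity.
C = rng.random((n_nodes, n_nodes))
C = (C + C.T) / 2
np.fill_diagonal(C, 0.0)

z = np.zeros(n_nodes, dtype=complex)     # complex amplitude of each node
activity = np.empty((n_steps, n_nodes))

for t in range(n_steps):
    # Hopf normal form per node plus diffusive coupling and additive noise.
    coupling = g * (C @ z - C.sum(axis=1) * z)
    noise = sigma * (rng.standard_normal(n_nodes)
                     + 1j * rng.standard_normal(n_nodes))
    dz = (a + 1j * omega) * z - (np.abs(z) ** 2) * z + coupling
    z = z + dt * dz + np.sqrt(dt) * noise
    activity[t] = z.real

# "Functional connectivity" of the simulated resting state: correlations
# between the noise-driven fluctuations of the nodes.
fc = np.corrcoef(activity.T)
print(np.round(fc, 2))
```

Under these assumptions, the printed correlation matrix is structured by the coupling matrix even though every node relaxes to the same fixed point without noise, which is the qualitative point of the edge-of-instability picture.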
