Cognition. 2019 Mar;184:53-68. doi: 10.1016/j.cognition.2018.12.002. Epub 2018 Dec 18.

The cognitive roots of regularization in language

Vanessa Ferdinand et al. Cognition. 2019 Mar.

Abstract

Regularization occurs when the output a learner produces is less variable than the linguistic data they observed. In an artificial language learning experiment, we show that there exist at least two independent sources of regularization bias in cognition: a domain-general source based on cognitive load and a domain-specific source triggered by linguistic stimuli. Both of these factors modulate how frequency information is encoded and produced, but only the production-side modulations result in regularization (i.e. cause learners to eliminate variation from the observed input). We formalize the definition of regularization as the reduction of entropy and find that entropy measures are better at identifying regularization behavior than frequency-based analyses. Using our experimental data and a model of cultural transmission, we generate predictions for the amount of regularity that would develop in each experimental condition if the artificial language were transmitted over several generations of learners. Here we find that the effect of cognitive constraints can become more complex when put into the context of cultural evolution: although learning biases certainly carry information about the course of language evolution, we should not expect a one-to-one correspondence between the micro-level processes that regularize linguistic datasets and the macro-level evolution of linguistic regularity.
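
The entropy-based definition lends itself to a short sketch. In the Python snippet below, a learner counts as regularizing when the Shannon entropy of their productions is lower than the entropy of the data they observed; the loop then shows how such a bias can compound when the data is transmitted across generations of learners. The production rule (probability sharpening) and the input frequencies here are hypothetical illustrations, not the paper's experimental design or transmission model.

    import math
    import random
    from collections import Counter

    def shannon_entropy(tokens):
        # Shannon entropy (in bits) of the empirical variant distribution.
        counts = Counter(tokens)
        n = len(tokens)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def produce(observed, exponent=2.0):
        # Toy regularizing learner (our assumption, not the paper's model):
        # observed variant probabilities are raised to an exponent > 1 and
        # renormalized, sharpening the distribution toward the majority variant.
        counts = Counter(observed)
        weights = {v: (c / len(observed)) ** exponent for v, c in counts.items()}
        total = sum(weights.values())
        variants = list(weights)
        probs = [weights[v] / total for v in variants]
        return random.choices(variants, probs, k=len(observed))

    random.seed(1)
    # Hypothetical input: 6 tokens of variant 'a', 4 of variant 'b'.
    data = ['a'] * 6 + ['b'] * 4
    print(f"generation 0: H = {shannon_entropy(data):.3f} bits")
    # Iterated transmission: each learner observes the previous learner's
    # productions and produces the data the next learner will observe.
    for gen in range(1, 6):
        data = produce(data)
        print(f"generation {gen}: H = {shannon_entropy(data):.3f} bits")

With a sharpening exponent above 1, entropy typically falls generation by generation until a single variant remains; with an exponent of 1 the learner probability-matches and no systematic entropy reduction is expected. This toy chain also illustrates the abstract's closing caution: the macro-level outcome depends on the transmission dynamics as a whole, not on the per-learner bias alone.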

Keywords: Domain generality; Domain specificity; Frequency learning; Language evolution; Regularisation.
