Review
Psychol Sci Public Interest. 2020 Dec;21(3):103-156.
doi: 10.1177/1529100620946707.

Citizens Versus the Internet: Confronting Digital Challenges With Cognitive Tools


Anastasia Kozyreva et al. Psychol Sci Public Interest. 2020 Dec.

Abstract

The Internet has evolved into a ubiquitous and indispensable digital environment in which people communicate, seek information, and make decisions. Despite offering various benefits, online environments are also replete with smart, highly adaptive choice architectures designed primarily to maximize commercial interests, capture and sustain users' attention, monetize user data, and predict and influence future behavior. This online landscape holds multiple negative consequences for society, such as a decline in human autonomy, rising incivility in online conversation, the facilitation of political extremism, and the spread of disinformation. Benevolent choice architects working with regulators may curb the worst excesses of manipulative choice architectures, yet the strategic advantages, resources, and data remain with commercial players. One way to address some of this imbalance is with interventions that empower Internet users to gain some control over their digital environments, in part by boosting their information literacy and their cognitive resistance to manipulation. Our goal is to present a conceptual map of interventions that are based on insights from psychological science. We begin by systematically outlining how online and offline environments differ despite being increasingly inextricable. We then identify four major types of challenges that users encounter in online environments: persuasive and manipulative choice architectures, AI-assisted information architectures, false and misleading information, and distracting environments. Next, we turn to how psychological science can inform interventions to counteract these challenges of the digital world. After distinguishing among three types of behavioral and cognitive interventions (nudges, technocognition, and boosts), we focus on boosts, of which we identify two main groups: (a) those aimed at enhancing people's agency in their digital environments (e.g., self-nudging, deliberate ignorance) and (b) those aimed at boosting competencies of reasoning and resilience to manipulation (e.g., simple decision aids, inoculation). These cognitive tools are designed to foster the civility of online discourse and protect reason and human autonomy against manipulative choice architectures, attention-grabbing techniques, and the spread of false information.

Keywords: attention economy; behavioral policy; boosting; choice architecture; cognitive tools; decision aids; disinformation; false news; media literacy; nudging.


Conflict of interest statement

Declaration of Conflicting Interests: The author(s) declared that there were no conflicts of interest with respect to the authorship or the publication of this article.

Figures

Fig. 1.
Entry points for policy interventions in the digital world: legal and ethical, technological, educational, and socio-psychological. Each entry point is shown with examples of potential policy measures and interventions. Entry points can inform each other; for instance, an understanding of psychological processes can contribute to the design of interventions for any entry point, and regulatory solutions can directly constrain and inform the design of technological and educational agendas. Icons are used under license from Adobe Stock.
Fig. 2.
Challenges in the digital world.
Fig. 3.
Categories and types of dark patterns. Source and visual materials: Dark Patterns Project at Princeton University (https://webtransparency.cs.princeton.edu/dark-patterns); see also Mathur et al. (2019). The icons are used with permission of the Dark Patterns Project.
Fig. 4.
Examples of AI-assisted information architectures online. Icons are used under license from Adobe Stock.
Fig. 5.
Main types of false and misleading information in the digital world. Icons are used under license from Adobe Stock.
Fig. 6.
Main sources and strategies of false and misleading information in the digital world. Icons are used under license from Adobe Stock.
Fig. 7.
Four classes of schedules of reinforcement. The operant-conditioning chamber (also known as the Skinner box) was used to study animal behavior by teaching an animal (e.g., a rat) to perform certain actions (e.g., pressing a lever) in response to a controlling stimulus (e.g., a light signal) reinforced by a reward (e.g., food). Different schedules of reinforcement were studied to see which would create steady and high rates of response behavior. By analogy, “virtual Skinner boxes,” such as social media or online gaming, offer their users rewards (e.g., likes or reaching another level in a game) at varying intervals to reinforce and maintain the desired behavior. Icons are used under license from Adobe Stock.
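The ratio schedules described above can be sketched in a few lines of code; the parameters, probabilities, and function names below are illustrative assumptions, not drawn from the article (interval schedules, which reward the first response after a time window rather than after a count of responses, are omitted for brevity):

```python
import random

random.seed(0)  # make the variable schedule reproducible in this sketch

def simulate(schedule, n_responses=20):
    """Return the 1-indexed responses that earned a reward."""
    return [i for i in range(1, n_responses + 1) if schedule(i)]

# Fixed-ratio schedule: reward every 5th response (fully predictable).
fixed_ratio = lambda i: i % 5 == 0

# Variable-ratio schedule: reward each response with probability 1/5 --
# unpredictable timing, the pattern behind slot machines and "likes".
variable_ratio = lambda i: random.random() < 0.2

print(simulate(fixed_ratio))     # prints [5, 10, 15, 20]
print(simulate(variable_ratio))  # irregular gaps between rewards
```

The unpredictability of the variable-ratio schedule is what sustains high, steady response rates, which is why it recurs in attention-grabbing designs.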
Fig. 8.
Types of behavioral and cognitive interventions for the digital world. The “nudging” icon is used under an Attribution 3.0 Unported (CC BY 3.0) license granted by Luis Prado at thenounproject.com. Other icons are used under license from Adobe Stock.
Fig. 9.
Map of boosting interventions for the digital world.
Fig. 10.
Self-nudging interventions in online environments. A summary of potential self-nudging interventions to enhance people’s control over their digital environments and their privacy protection online. Based in part on Center for Humane Technology (2019) and Epstein (2017). Icons are used under license from Adobe Stock.
Fig. 11.
A simple lateral-reading boost. Based on research by the Stanford History Education Group (Breakstone et al., 2018; McGrew et al., 2019; Wineburg & McGrew, 2019). Icons are used under license from Adobe Stock.
Fig. 12.
“Can you trust this information?”: This fast-and-frugal decision tree provides users with three crucial steps for evaluating the trustworthiness of information online. Based on research by the Stanford History Education Group (Breakstone et al., 2018; Wineburg & McGrew, 2019). Icons are used under license from Adobe Stock.
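A fast-and-frugal decision tree of this kind can be sketched as an ordered sequence of yes/no cues, each of which can trigger an immediate exit decision; the questions, field names, and decisions below are illustrative placeholders, not the exact wording of the figure:

```python
from typing import Callable, Dict, List, Tuple

# Each cue is (question, test, decision-if-the-answer-is-no).
Cue = Tuple[str, Callable[[Dict[str, bool]], bool], str]

def fft_decide(item: Dict[str, bool], cues: List[Cue], default: str) -> str:
    """Walk the cues in order; the first cue answered 'no' exits immediately."""
    for question, test, exit_decision in cues:
        if not test(item):
            return exit_decision
    return default  # every cue passed

# Hypothetical cues for evaluating a piece of online information:
cues: List[Cue] = [
    ("Do you know and trust the source?",
     lambda i: i["known_source"], "verify elsewhere"),
    ("Do other reputable sites corroborate it?",
     lambda i: i["corroborated"], "do not trust"),
    ("Does it cite checkable evidence?",
     lambda i: i["has_evidence"], "do not trust"),
]

claim = {"known_source": True, "corroborated": True, "has_evidence": False}
print(fft_decide(claim, cues, default="trust"))  # prints "do not trust"
```

The key property of a fast-and-frugal tree is that every question has at least one exit, so a user never needs to weigh all cues at once; the first disqualifying answer settles the decision.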
Fig. 13.
Map of challenges and boosts in the digital world. Icons are used under license from Adobe Stock.

