Nudging recommendation algorithms increases news consumption and diversity on YouTube
- PMID: 39618512
- PMCID: PMC11604067
- DOI: 10.1093/pnasnexus/pgae518
Abstract
Recommendation algorithms profoundly shape users' attention and information consumption on social media platforms. This study introduces a computational intervention aimed at mitigating two key biases in algorithms by influencing the recommendation process. We tackle interest bias, or algorithms creating narrow nonnews and entertainment information diets, and ideological bias, or algorithms directing the more strongly partisan users to like-minded content. Employing a sock-puppet experiment alongside a month-long randomized experiment involving 2,142 frequent YouTube users, we investigate whether nudging the algorithm by playing videos from verified and ideologically balanced news channels in the background increases recommendations to and consumption of news. We additionally test whether providing balanced news input to the algorithm promotes diverse and cross-cutting news recommendations and consumption. We find that nudging the algorithm significantly and sustainably increases both recommendations to and consumption of news and also minimizes ideological biases in recommendations and consumption, particularly among conservative users. In fact, recommendations have stronger effects on users' exposure than users' exposure has on subsequent recommendations. In contrast, nudging the users has no observable effects on news consumption. Increased news consumption has no effects on a range of survey outcomes (i.e. political participation, belief accuracy, perceived and affective polarization, and support for democratic norms), adding to the growing evidence of limited attitudinal effects of on-platform exposure. The intervention does not adversely affect user engagement on YouTube, showcasing its potential for real-world implementation. These findings underscore the influence wielded by platform recommender algorithms on users' attention and information exposure.
Keywords: computational social science; filter bubbles; news exposure; recommendation algorithm; social media.
© The Author(s) 2024. Published by Oxford University Press on behalf of National Academy of Sciences.