Methodological Credibility: An Empirical Investigation of the Public's Perception of Evaluation Findings and Methods
- PMID: 27460880
- DOI: 10.1177/0193841X16657728
Abstract
Background: When evaluations are broadly disseminated, the public can use them to support a program or to advocate for change.
Methods: To explore how the public perceives and uses evaluations, a sample of 425 individuals in the United States was recruited through an online crowdsourcing service, Mechanical Turk (www.mturk.com). Participants were randomly assigned to receive one of several versions of a press release describing a summative evaluation of a program. Each version paired a particular evaluation method (e.g., a randomized controlled design) with a particular direction of findings (positive or negative). Participants in each condition then answered questions about their trust in the evaluation findings and their attitudes toward the program.
Results: Both the type of evaluation method and the direction of the findings influenced the perceived credibility of the findings, and credibility in turn moderated the relationship between the direction of the findings and attitudes toward the evaluated program. Additional evaluation factors to explore in future research with the public are recommended.
Keywords: credibility; evaluation influence; evaluation methodology; evaluation use.
© The Author(s) 2016.
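The moderation result reported in the abstract corresponds to a regression in which the effect of findings direction on program attitudes varies with perceived credibility, i.e., a model with an interaction term. The sketch below illustrates that structure on simulated data only; the variable names, effect sizes, and noise level are illustrative assumptions, not the study's actual data or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 425  # sample size matching the study's N

# Simulated design: findings direction is a binary manipulation,
# perceived credibility is a standardized continuous rating.
findings = rng.integers(0, 2, n).astype(float)   # 0 = negative, 1 = positive
credibility = rng.normal(0.0, 1.0, n)

# Hypothetical data-generating model: attitudes depend on both main
# effects plus a findings x credibility interaction (the moderation).
attitude = (0.5 + 0.8 * findings + 0.3 * credibility
            + 0.6 * findings * credibility + rng.normal(0.0, 0.5, n))

# Fit the moderated regression by ordinary least squares.
X = np.column_stack([np.ones(n), findings, credibility,
                     findings * credibility])
beta, *_ = np.linalg.lstsq(X, attitude, rcond=None)

# beta[3] estimates the interaction: a nonzero value means credibility
# moderates how findings direction relates to program attitudes.
print(beta)
```

With this setup, the recovered interaction coefficient (`beta[3]`) lands near the assumed 0.6, which is the pattern a moderation analysis of the study's data would test for.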