Pay for performance, satisfaction and retention in longitudinal crowdsourced research
- PMID: 33471835
- PMCID: PMC7817012
- DOI: 10.1371/journal.pone.0245460
Abstract
In the social and cognitive sciences, crowdsourcing provides up to half of all research participants. Despite this popularity, researchers typically do not conceptualize these participants accurately: as gig-economy worker-participants. Applying theories of employee motivation and of the psychological contract between employees and employers, we hypothesized that pay and pay raises would drive worker-participant satisfaction, performance, and retention in a longitudinal study. In an experiment hiring 359 Amazon Mechanical Turk Workers, we found that initial pay, relative increases in pay over time, and overall pay did not substantially influence subsequent performance. However, pay significantly predicted participants' perceived choice, justice perceptions, and attrition. We therefore conclude that worker-participants are particularly vulnerable to exploitation, as they have relatively little power to negotiate pay. These results suggest that researchers wishing to crowdsource research participants using MTurk might not face practical dangers, such as decreased performance, as a result of lower pay, but they must recognize an ethical obligation to treat Workers fairly.
Conflict of interest statement
The authors have declared that no competing interests exist.
