Using process evaluation for program improvement in dose, fidelity and reach: the ACT trial experience

Dawn K Wilson et al. Int J Behav Nutr Phys Act. 2009 Nov 30;6:79. doi: 10.1186/1479-5868-6-79.

Abstract

Background: The purpose of this study was to demonstrate how formative program process evaluation was used to improve dose and fidelity of implementation, as well as reach of the intervention into the target population, in the "Active by Choice Today" (ACT) randomized school-based trial from years 1 to 3 of implementation.

Methods: The intervention integrated constructs from Self-Determination Theory and Social Cognitive Theory to enhance intrinsic motivation and behavioral skills for increasing long-term physical activity (PA) behavior in underserved adolescents (low-income, minority). ACT formative process data were examined at the end of each year to provide timely, corrective feedback to keep the intervention "on track".

Results: Between years 1 and 2 and years 2 and 3, three significant changes were made in an attempt to increase dose and fidelity of program delivery and participant attendance (reach). These changes included expanding staff training, reformatting the intervention manual, and developing a tracking system for contacting parents of students who were not attending the after-school programs regularly. Process outcomes suggest that these efforts resulted in notable improvements in attendance, dose, and fidelity of intervention implementation from years 1 to 2 and years 2 to 3 of the ACT trial.

Conclusion: Process evaluation methods, particularly implementation monitoring, are useful tools for ensuring fidelity in intervention trials and for identifying key best practices for intervention delivery.
