Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping

Byron J Powell et al. Implement Sci. 2017 Oct 3;12(1):118. doi: 10.1186/s13012-017-0649-x.

Abstract

Background: Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct. It built upon a systematic review of the literature and semi-structured stakeholder interviews that generated 47 criteria for pragmatic measures, and aimed to further refine that set of criteria by identifying conceptually distinct categories of the pragmatic measure construct and providing quantitative ratings of the criteria's clarity and importance.

Methods: Twenty-four stakeholders with expertise in implementation practice completed a concept mapping activity wherein they organized the initial list of 47 criteria into conceptually distinct categories and rated their clarity and importance. Multidimensional scaling, hierarchical cluster analysis, and descriptive statistics were used to analyze the data.

Findings: The 47 criteria were meaningfully grouped into four distinct categories: (1) acceptable, (2) compatible, (3) easy, and (4) useful. Average ratings of clarity and importance at the category and individual criterion level are presented.

Conclusions: This study advances the field of implementation science and practice by providing clear and conceptually distinct domains of the pragmatic measure construct. Next steps will include a Delphi process to develop consensus on the most important criteria and the development of quantifiable pragmatic rating criteria that can be used to assess measures.
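
To make the analysis described in the Methods concrete, the following is a minimal, illustrative sketch in Python. The sorting data, the number of piles, and all variable names are synthetic assumptions rather than the authors' dataset or code; the sketch only demonstrates the general pipeline of aggregating co-sorting into a similarity matrix, applying multidimensional scaling, and clustering the result.

```python
# Illustrative concept-mapping analysis pipeline (synthetic data, not the study's).
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_items, n_sorters = 47, 23            # 47 criteria; 23 valid sorting responses

# Synthetic stand-in for the sorting task: each stakeholder assigns every
# criterion to one of a handful of self-defined piles.
sorts = rng.integers(0, 5, size=(n_sorters, n_items))

# Co-occurrence matrix: how often each pair of criteria landed in the same pile.
co = np.zeros((n_items, n_items))
for s in sorts:
    co += (s[:, None] == s[None, :])

# Convert co-sorting counts (similarity) into dissimilarities for MDS.
dissim = n_sorters - co
np.fill_diagonal(dissim, 0.0)

# Two-dimensional nonmetric multidimensional scaling on the precomputed distances.
mds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)

# Hierarchical (Ward) clustering of the MDS coordinates; cut the tree at four
# clusters to mirror the four-category solution reported in the Findings.
clusters = fcluster(linkage(coords, method="ward"), t=4, criterion="maxclust")
print(clusters)
```

Descriptive statistics on the clarity and importance ratings (means per criterion and per cluster) would then be computed directly from the rating data; they are omitted here for brevity.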

Ethics approval and consent to participate

The institutional review boards at Indiana University, the University of Montana, and the University of North Carolina at Chapel Hill approved all study procedures. Written informed consent was obtained for all study procedures.

Consent for publication

Not applicable.

Competing interests

MW is Co-Editor-in-Chief, BW is an Associate Editor, and BJP, LJD, and LW are members of the Editorial Board of Implementation Science. None of the authors were involved in any editorial decisions related to this manuscript. The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Figures

Fig. 1. Point and cluster map of criteria demonstrating spatial relationships (n = 23). This point and cluster map reflects the product of our stakeholders’ (valid response n = 23) sorting of the 47 criteria into groups that they deemed conceptually similar. Each criterion is depicted as a dot with a number that corresponds to Table 1. The distances between criteria reflect the frequency with which they were sorted together; thus, criteria that were sorted together frequently are closer together on the map. These spatial relationships are relative to the data in this study and do not reflect an absolute relationship (i.e., a 5-mm distance on this map does not reflect the same relationship as a 5-mm distance on a map from a different dataset) [15]. Items 19 (“sensitive to change”) and 7 (“important to clinical care”) were originally assigned to the “compatible” cluster but were moved to the “useful” cluster because the investigative team believed they represented a better conceptual fit there. The gray dotted lines within the “useful” cluster and between the “useful” and “compatible” clusters show how the clusters would have been drawn if we had not made this change.
Fig. 2. Mean clarity and importance ratings per cluster (n = 24)
Fig. 3. Go-zone graph of mean clarity and importance ratings (n = 24). The ranges of the x- and y-axes reflect the mean values obtained for all 47 pragmatic criteria on the clarity and importance rating scales. The plot is divided into quadrants based upon the overall mean values for each rating scale: quadrant I (above the mean for both clarity and importance), quadrant II (above the mean for clarity, below the mean for importance), quadrant III (below the mean for both clarity and importance), and quadrant IV (below the mean for clarity, above the mean for importance).
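
As an illustration of how a go-zone graph like Fig. 3 is constructed, here is a small sketch; the ratings are randomly generated placeholders, and the assignment of clarity to the x-axis and importance to the y-axis is an assumption for the example, not taken from the paper.

```python
# Synthetic go-zone plot: quadrants defined by the overall means of each scale.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
clarity = rng.uniform(3.0, 5.0, 47)     # synthetic mean clarity rating per criterion
importance = rng.uniform(2.5, 5.0, 47)  # synthetic mean importance rating per criterion

fig, ax = plt.subplots()
ax.scatter(clarity, importance)
# The quadrant boundaries are the overall means of each rating scale.
ax.axvline(clarity.mean(), linestyle="--", color="gray")
ax.axhline(importance.mean(), linestyle="--", color="gray")
ax.set_xlabel("Mean clarity rating")
ax.set_ylabel("Mean importance rating")
ax.set_title("Go-zone graph (synthetic illustration)")
plt.show()
```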

References

    1. Proctor EK, Powell BJ, Feely M. Measurement in dissemination and implementation science. In: Beidas RS, Kendall PC, editors. Dissemination and implementation of evidence-based practices in child and adolescent mental health. New York: Oxford University Press; 2014. pp. 22–43.
    2. Glasgow RE, Riley WT. Pragmatic measures: what they are and why we need them. Am J Prev Med. 2013;45:237–243. doi: 10.1016/j.amepre.2013.03.010.
    3. Lewis CC, Weiner BJ, Stanick C, Fischer SM. Advancing implementation science through measure development and evaluation: a study protocol. Implement Sci. 2015;10:1–10. doi: 10.1186/s13012-014-0195-8.
    4. Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons GA, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38:65–76. doi: 10.1007/s10488-010-0319-7.
    5. Weiner BJ, Lewis CC, Stanick CS, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:1–12. doi: 10.1186/s13012-017-0635-3.
