Comparative Study

Implement Sci. 2019 Feb 1;14(1):11. doi: 10.1186/s13012-019-0853-y

Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration

Randall C Gale et al. Implement Sci. 2019.

Abstract

Background: In-depth qualitative analyses are challenging to conduct and to disseminate quickly; their time-consuming methods can impede timely implementation of interventions. To better understand the tradeoffs between the need for actionable results and scientific rigor, we present our method for conducting a framework-guided rapid analysis (RA) and compare its findings to those of an in-depth analysis of the same interview transcripts.

Methods: Set within the context of an evaluation of a successful academic detailing (AD) program for opioid prescribing in the Veterans Health Administration, we developed interview guides informed by the Consolidated Framework for Implementation Research (CFIR) and interviewed 10 academic detailers (clinical pharmacists) and 20 primary care providers to elicit detail about successful features of the program. For the RA, verbatim transcripts were summarized using a structured template (based on CFIR); summaries were subsequently consolidated into matrices by participant type to identify aspects of the program that worked well and ways to facilitate implementation elsewhere. For comparison purposes, we later conducted an in-depth analysis of the transcripts. We described our RA approach and qualitatively compared the RA and deductive in-depth analysis with respect to consistency of themes and resource intensity.
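
To make the two-step RA workflow concrete, the sketch below shows one way the templated, CFIR-keyed interview summaries (rapid analytic step 1) could be consolidated into matrices by participant type (rapid analytic step 2). The paper describes MS Excel matrices populated by hand; this Python/pandas version is only an illustrative assumption, and the file name, participant IDs, CFIR domain labels, and summary entries are hypothetical placeholders, not study data.

# Minimal sketch (not the authors' tooling) of RA step 2: consolidate
# templated, CFIR-keyed interview summaries into one matrix per
# participant type. Requires pandas and openpyxl for the Excel export.
import pandas as pd

# Hypothetical per-interview summaries: one dict per transcript,
# keyed by CFIR domain, holding the analyst's templated notes.
summaries = [
    {"participant_id": "AD-01", "participant_type": "academic detailer",
     "Intervention Characteristics": "Materials seen as credible and concise.",
     "Inner Setting": "Leadership support varied across VAMCs."},
    {"participant_id": "PCP-07", "participant_type": "primary care provider",
     "Intervention Characteristics": "One-on-one visits fit clinic workflow.",
     "Inner Setting": "Limited time available for detailing visits."},
]

# Build the combined matrix, then write one worksheet per participant
# type so themes can be identified, sorted, and displayed side by side.
matrix = pd.DataFrame(summaries).set_index("participant_id")
with pd.ExcelWriter("ra_matrices.xlsx") as writer:  # hypothetical output file
    for ptype, rows in matrix.groupby("participant_type"):
        rows.drop(columns="participant_type").to_excel(writer, sheet_name=ptype[:31])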

Results: Integrating the CFIR throughout the RA and the in-depth analysis provided structure and consistency across both analyses. Findings from the two analyses were consistent: the most frequently coded constructs from the in-depth analysis aligned well with themes from the RA, and the RA methods were sufficient and appropriate for addressing the primary evaluation goals. Our approach to RA was less resource-intensive than the in-depth analysis, allowing for timely dissemination of findings to our operations partner that could be integrated into ongoing implementation.

Conclusions: In-depth analyses can be resource-intensive. If consistent with project needs (e.g., to quickly produce information to inform ongoing implementation or to comply with a policy mandate), it is reasonable to consider using RA, especially when faced with resource constraints. Our RA provided valid findings in a short timeframe, enabling identification of actionable suggestions for our operations partner.

Keywords: Academic detailing; CFIR; Implementation framework; Qualitative methods; Rapid analysis; Veterans.

Conflict of interest statement

Ethics approval and consent to participate

The evaluation met the definition of quality improvement and was determined by the Institutional Review Board of record, Stanford University, to be non-human subjects research.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Figures

Fig. 1 (Rapid analytic step 1) Templated summary table used to summarize each interview transcript. Example from academic detailer interview summary table; similar tables were generated for detailed and not detailed providers. IVG interview guide, AD academic detailer/detailing, VA Veterans Affairs, VAMC Veterans Affairs Medical Center, CFIR Consolidated Framework for Implementation Research

Fig. 2 (Rapid analytic step 2) MS Excel matrix by participant type for identifying themes, sorting, and visual display; populated using information from the templated summary tables

Fig. 3 Timeline for conducting the rapid and in-depth analyses. Some transcript coding took place as part of CFIR codebook development (i.e., the first 93 days). CFIR Consolidated Framework for Implementation Research
