Meta-Analysis

Impact of Clinicians' Use of Electronic Knowledge Resources on Clinical and Learning Outcomes: Systematic Review and Meta-Analysis

Lauren A Maggio et al. J Med Internet Res. 2019 Jul 25;21(7):e13315. doi: 10.2196/13315.

Abstract

Background: Clinicians use electronic knowledge resources, such as Micromedex, UpToDate, and Wikipedia, to deliver evidence-based care and engage in point-of-care learning. Despite this use in clinical practice, their impact on patient care and learning outcomes is incompletely understood. A comprehensive synthesis of available evidence regarding the effectiveness of electronic knowledge resources would guide clinicians, health care system administrators, medical educators, and informaticians in making evidence-based decisions about their purchase, implementation, and use.

Objective: The aim of this review is to quantify the impact of electronic knowledge resources on clinical and learning outcomes.

Methods: We searched MEDLINE, Embase, PsycINFO, and the Cochrane Library for articles published from 1991 to 2017. Two authors independently screened studies for inclusion and extracted outcomes related to knowledge, skills, attitudes, behaviors, patient effects, and cost. We used random-effects meta-analysis to pool standardized mean differences (SMDs) across studies.
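For readers unfamiliar with the pooling step, the sketch below illustrates random-effects pooling of SMDs using a DerSimonian-Laird estimator. This is an assumption for illustration only: the abstract does not name the estimator or software the authors used, and the input values are hypothetical, not data from the review.

# Illustrative sketch: DerSimonian-Laird random-effects pooling of
# standardized mean differences (SMDs). Hypothetical inputs; not the
# review's data and not necessarily the authors' exact method.
import math

def pool_smd_random_effects(smds, ses):
    """Pooled SMD and 95% CI under a DerSimonian-Laird random-effects model."""
    w = [1 / se ** 2 for se in ses]                              # inverse-variance (fixed-effect) weights
    d_fixed = sum(wi * d for wi, d in zip(w, smds)) / sum(w)     # fixed-effect pooled SMD
    q = sum(wi * (d - d_fixed) ** 2 for wi, d in zip(w, smds))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(smds) - 1)) / c)                   # between-study variance estimate
    w_re = [1 / (se ** 2 + tau2) for se in ses]                  # random-effects weights
    pooled = sum(wi * d for wi, d in zip(w_re, smds)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical three-study example
print(pool_smd_random_effects([0.30, 0.55, 0.45], [0.10, 0.15, 0.12]))

In practice an established meta-analysis package would be used; the sketch is only meant to make the SMD-pooling step concrete.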

Results: Of 10,811 studies screened, we identified 25 eligible studies published between 2003 and 2016. A total of 5 studies were randomized trials, 22 involved physicians in practice or training, and 10 reported potential conflicts of interest. A total of 15 studies compared electronic knowledge resources with no intervention. Of these, 7 reported clinician behaviors, with a pooled SMD of 0.47 (95% CI 0.27 to 0.67; P<.001), and 8 reported objective patient effects, with a pooled SMD of 0.19 (95% CI 0.07 to 0.32; P=.003). Heterogeneity was large (I²>50%) across studies. When compared with other resources (7 studies, not amenable to meta-analytic pooling), the use of electronic knowledge resources was associated with increased frequency of answering questions and perceived benefits on patient care, with variable impact on time to find an answer. A total of 2 studies compared different implementations of the same electronic knowledge resource.
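For context on the heterogeneity statement above: I² expresses the percentage of total variability across studies attributable to between-study heterogeneity rather than chance, and values above 50% are commonly interpreted as substantial. Assuming the standard definition based on Cochran's Q with k studies, inverse-variance weights w_i, and study effect sizes d_i:

Q = \sum_{i=1}^{k} w_i (d_i - \bar{d})^2, \qquad
I^2 = \max\left(0,\ \frac{Q - (k - 1)}{Q}\right) \times 100\%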

Conclusions: Use of electronic knowledge resources is associated with statistically significant improvements in clinician behaviors and patient effects. When compared with other resources, the use of electronic knowledge resources was associated with increased success in answering clinical questions, with variable impact on speed. Comparisons of different implementation strategies of the same electronic knowledge resource suggest benefits from allowing clinicians to choose when to access the resource, rather than automatically displaying resource information, and from integrating patient-specific information. The 4 studies that compared different commercial electronic knowledge resources reported variable results. Resource implementation strategies can significantly influence outcomes, but few studies have examined such factors.

Keywords: clinical decision support; educational technology; health information technology; information systems; medical education.

Conflict of interest statement

Conflicts of Interest: In 2016, LAM received travel funds to deliver a lecture on evidence-based medicine for employees of Ebsco, the parent company of DynaMed; Ebsco did not have any involvement in the conduct of this study. We are unaware of any other conflicts of interest.

Figures

Figure 1. Trial flowchart.

Figure 2. Comparative usage of electronic knowledge resources versus no intervention. Knowledge outcome analyses are weighted by user, while behavior and patient effects analyses are weighted by patients or hospitals. “a” denotes a locally developed resource; “b” is the number of hospitals, not patients; “c” indicates no comparison group (ie, one-group, pre-/postintervention study). Abx Guide: Johns Hopkins Antibiotic Guide; Ang Soft: angina software; CEM: clinical evidence module; eAAP: Emergency Asthma Action Plan; Epoc: Epocrates; GRAIDS: Genetic Risk Assessment on the Internet with Decision Support; InfoRet: InfoRetriever; MD: practicing physicians; MOC: Maintenance of Certification; MS: medical students; NP: nurse practitioners; ns: not specified; PG: residents; PIER: Physicians’ Information and Education Resource; Rep Sup: Report Support; SCAMP: Standardized Clinical Assessment and Management Plans; UTD: UpToDate.

Figure 3. Impact of electronic knowledge resources in comparison with other resources (Panel A) and alternate electronic knowledge resources (Panel B). All analyses are weighted by patients except as noted. “a” refers to analysis weighted by users; “b” indicates that the comparison group (ie, study data) is the same for these contrasts; “c” indicates the comparison type “Mixed,” that is, a comparison with both electronic and nonelectronic knowledge resources; “d” indicates the comparison type “Any other,” that is, users could select any resource except the ones it was being compared against; “e” denotes a locally developed resource. 5-min: 5-Minute Clinical Consult; AccessMed: AccessMedicine; ARUSC: Antibiotic Utilization and Surveillance-Control; Clin Evid: clinical evidence; Epoc: Epocrates; InfoRet: InfoRetriever; K: Knowledge; MD: practicing physicians; MMX: Micromedex; MS: medical students; NOS: not otherwise specified; NP: nurse practitioners; ns: not specified; PG: residents; Q: question; rec: recommendation; spec: specific; Taras: Tarascon Pharmacopeia; Trip: Turning Research Into Practice; UTD: UpToDate; Wiki: Wikipedia.

