J Contin Educ Health Prof. 2020 Fall;40(4):248-256. doi: 10.1097/CEH.0000000000000316.

Toward Practice-Based Continuing Education Protocols: Using Testing to Help Physicians Update Their Knowledge


Heather Armson et al. J Contin Educ Health Prof. 2020 Fall.

Abstract

Introduction: Using assessment to facilitate learning is a well-established priority in education but has shown variable effectiveness for continuing professional development. The factors that modulate the impact of testing in practicing physicians remain unclear. We aimed to improve the capacity to support maintenance of competence by exploring variables that influence the value of web-based pretesting.

Methods: Family physicians belonging to a practice-based learning program studied two educational modules independently or in small groups. Before the learning sessions, they completed a needs assessment and were assigned either to complete a pretest intervention or to read a relevant review article. After the learning session, they completed an outcome test, indicated plans to change practice, and subsequently documented the changes made.

Results: One hundred twelve physicians completed the study, 92 in small groups. The average lag between tests was 6.3 weeks. Relative to those given a review article, physicians given a pretest intervention: (1) reported spending less time completing the assigned task (16.7 versus 25.7 minutes); (2) performed better on outcome test questions that were repeated from the pretest (65.9% versus 58.7%); and (3) when the learning module was completed independently, reported making a greater proportion of practice changes to which they committed (80.0% versus 45.0%). Knowledge gain was unrelated to physicians' stated needs.

Discussion: Low-stakes formative quizzes, delivered with feedback, can influence the amount of material practicing physicians remember from an educational intervention independent of perceptions regarding the need to engage in continuing professional development on the particular topic.


