Validation of an informatics tool to assess resident's progress in developing reporting skills

Facundo N Diaz et al. Insights Imaging. 2019 Sep 23;10(1):96. doi: 10.1186/s13244-019-0772-0.

Abstract

Background: Diagnostic radiology residency programs pursue two main objectives: developing diagnostic capabilities and developing the written communication skills needed to answer the questions of referring clinicians. There has also been an increasing focus on competencies rather than just educational inputs. To demonstrate ongoing professional development, a system is therefore needed to assess and document residents' competence in these areas. We propose the implementation of an informatics tool to objectively assess residents' progress in developing diagnostic and reporting skills. We expected to find decreased variability between preliminary and final reports over the course of each year of the residency program.

Results: We analyzed 12,162 evaluations from 32 residents (8 residents per year in a 4-year residency program) over a 7-month period; 73.96% of these evaluations belonged to 2nd-year residents. We chose two indicators to study the evolution of the evaluations: the total number of discrepancies over the total number of preliminary reports (excluding score 0), and the total number of likely clinically significant discrepancies (scores 2b, 3b, and 4b) over the total number of preliminary reports (excluding score 0). Analyzing these two indicators over the evaluations of 2nd-year residents, we found a slight decrease in the first indicator and relatively stable behavior of the second.
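The two indicators above can be sketched as a small computation. This is a minimal illustration, not the authors' implementation; it assumes a RADPEER-style scheme in which score 0 marks a non-evaluable report, score 1 a concordant report, "a" scores discrepancies unlikely to be clinically significant, and "b" scores (2b, 3b, 4b) discrepancies likely to be clinically significant.

```python
# Hypothetical sketch of the two indicators (Discr/Total and b/Total),
# assuming RADPEER-style score strings:
#   "0"              = not evaluable (excluded from both denominators)
#   "1"              = concordant preliminary report
#   "2a", "3a", "4a" = discrepancy unlikely to be clinically significant
#   "2b", "3b", "4b" = discrepancy likely to be clinically significant

def report_indicators(scores):
    """Return (discr_over_total, b_over_total) for a list of score strings."""
    evaluable = [s for s in scores if s != "0"]   # exclude score 0
    total = len(evaluable)
    if total == 0:
        return 0.0, 0.0
    discrepancies = [s for s in evaluable if s != "1"]
    significant = [s for s in evaluable if s in {"2b", "3b", "4b"}]
    return len(discrepancies) / total, len(significant) / total

# Example: 10 evaluations, one of which (score 0) is excluded
scores = ["1", "1", "0", "2a", "1", "3b", "1", "2b", "1", "1"]
discr, sig = report_indicators(scores)
# discr = 3/9 (three discrepancies), sig = 2/9 (two "b" scores)
```

Tracking these two ratios per month, as in the figures below, shows how a cohort's preliminary-report accuracy evolves over the academic year.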

Conclusions: This tool is useful for the objective assessment of the reporting skills of radiology residents. It can also provide an opportunity for continuing medical education through case-based learning from cases with clinically significant discrepancies between the preliminary and final reports.

Keywords: Academic performance; Education; Educational measurement; Internship and residency.


Conflict of interest statement

The authors declare that they have no competing interests.

Figures

Fig. 1
Informatics tool interface. An example of the resident evaluation tool in our reporting software. Workflow: the resident saves the preliminary report (a). The attending radiologist interprets the study, opens the resident evaluation application form (b), and scores the preliminary report (c; in this case, concurring with the interpretation) before finally saving and signing the study (d)
Fig. 2
Pie chart showing the distribution of evaluations by year of residency
Fig. 3
Evolution of scores over time in 2nd-year residents' evaluations
Fig. 4
Evolution of indicators over time in 2nd-year residents' evaluations. Discr/Total is the total number of discrepancies over the total number of preliminary reports (excluding score 0), and b/Total is the total number of likely clinically significant discrepancies (scores 2b, 3b, 4b) over the total number of preliminary reports (also excluding score 0)
Fig. 5
Evolution of indicators over time in the whole group of residents. Discr/Total is the total number of discrepancies over the total number of preliminary reports (excluding score 0), and b/Total is the total number of likely clinically significant discrepancies (scores 2b, 3b, 4b) over the total number of preliminary reports (also excluding score 0)

