AJR Am J Roentgenol. 2015 Apr;204(4):903-8. doi: 10.2214/AJR.14.12903.

Do mammographic technologists affect radiologists' diagnostic mammography interpretative performance?

Louise M Henderson et al.

Abstract

Objective: The purpose of this study was to determine whether the technologist has an effect on the radiologists' interpretative performance of diagnostic mammography.

Materials and methods: Using data from a community-based mammography registry from 1994 to 2009, we identified 162,755 diagnostic mammograms interpreted by 286 radiologists and performed by 303 mammographic technologists. We calculated sensitivity, false-positive rate, and positive predictive value (PPV) of the recommendation for biopsy from mammography for examinations performed (i.e., images acquired) by each mammographic technologist, separately for conventional (film-screen) and digital modalities. We assessed the variability of these performance measures among mammographic technologists, using mixed effects logistic regression and taking into account the clustering of examinations within women, radiologists, and radiology practices.
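As an illustrative sketch (not the authors' code), the three per-technologist performance measures described above can be computed from per-examination outcomes, where a "positive" interpretation is a recommendation for biopsy and truth is whether cancer was subsequently diagnosed. The function name and data layout below are hypothetical.

```python
import math

def performance_measures(exams):
    """Compute sensitivity, false-positive rate, and PPV of the
    recommendation for biopsy (PPV2) for one technologist's exams.

    exams: list of (biopsy_recommended: bool, cancer: bool) tuples,
    one per diagnostic mammogram performed by that technologist.
    """
    tp = sum(1 for rec, ca in exams if rec and ca)        # true positives
    fp = sum(1 for rec, ca in exams if rec and not ca)    # false positives
    fn = sum(1 for rec, ca in exams if not rec and ca)    # false negatives
    tn = sum(1 for rec, ca in exams if not rec and not ca)  # true negatives
    sensitivity = tp / (tp + fn) if tp + fn else math.nan
    false_positive_rate = fp / (fp + tn) if fp + tn else math.nan
    ppv_biopsy = tp / (tp + fp) if tp + fp else math.nan  # PPV2
    return sensitivity, false_positive_rate, ppv_biopsy

# Hypothetical example: 10 exams performed by one technologist
exams = ([(True, True)] * 3 + [(True, False)] * 2 +
         [(False, True)] * 1 + [(False, False)] * 4)
sens, fpr, ppv = performance_measures(exams)
# sens = 3/4, fpr = 2/6, ppv = 3/5
```

The study additionally modeled these measures with mixed effects logistic regression to account for clustering of examinations within women, radiologists, and practices; the sketch above covers only the raw per-technologist rates.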

Results: Among the 291 technologists performing conventional examinations, mean sensitivity of the examinations performed was 83.0% (95% CI, 80.8-85.2%), mean false-positive rate was 8.5% (95% CI, 8.0-9.0%), and mean PPV of the recommendation for biopsy from mammography was 27.1% (95% CI, 24.8-29.4%). For the 45 technologists performing digital examinations, mean sensitivity of the examinations they performed was 79.6% (95% CI, 73.1-86.2%), mean false-positive rate was 8.8% (95% CI, 7.5-10.0%), and mean PPV of the recommendation for biopsy from mammography was 23.6% (95% CI, 18.8-28.4%). We found significant variation by technologist in the sensitivity, false-positive rate, and PPV of the recommendation for biopsy from mammography for conventional but not digital mammography (p < 0.0001 for all three interpretive performance measures).

Conclusion: Our results suggest that the technologist has an influence on radiologists' interpretive performance for diagnostic conventional but not digital mammography. Future studies should examine why this difference between modalities exists and determine if similar patterns are observed for screening mammography.

Keywords: diagnostic mammography; false-positive rate; positive predictive value of biopsy; sensitivity; variability.

Conflict of interest statement

None.

Figures

Figure 1. Distribution of the average number of diagnostic mammograms performed per technologist per year. A, film; B, digital.

Figure 2. Model-based smoothed histograms of diagnostic mammography performance measures for the 303 technologists by imaging modality (film and digital), with vertical lines at the 25th, 50th, and 75th percentiles. A, sensitivity, film; B, sensitivity, digital; C, false-positive rate, film; D, false-positive rate, digital; E, positive predictive value of biopsy (PPV2), film; F, positive predictive value of biopsy (PPV2), digital.

References

    1. Dee KE, Sickles EA. Medical audit of diagnostic mammography examinations: comparison with screening outcomes obtained concurrently. AJR. 2001;176(3):729–33. - PubMed
    1. Sohlich RE, Sickles EA, Burnside ES, Dee KE. Interpreting data from audits when screening and diagnostic mammography outcomes are combined. AJR. 2002;178(3):681–6. - PubMed
    1. Sickles EA, Miglioretti DL, Ballard-Barbash R, et al. Performance benchmarks for diagnostic mammography. Radiology. 2005;235(3):775–90. - PubMed
    1. Ballard-Barbash R, Taplin SH, Yankaskas BC, et al. Breast Cancer Surveillance Consortium: a national mammography screening and outcomes database. AJR. 1997;169(4):1001–8. - PubMed
    1. Jackson SL, Taplin SH, Sickles EA, et al. Variability of interpretive accuracy among diagnostic mammography facilities. J Natl Cancer Inst. 2009;101(11):814–27. - PMC - PubMed
