Selective identification and localization of indolent and aggressive prostate cancers via CorrSigNIA: an MRI-pathology correlation and deep learning framework

Indrani Bhattacharya et al.

Med Image Anal. 2022 Jan;75:102288. doi: 10.1016/j.media.2021.102288. Epub 2021 Nov 6.

Abstract

Automated methods for detecting prostate cancer and distinguishing indolent from aggressive disease on Magnetic Resonance Imaging (MRI) could assist in early diagnosis and treatment planning. Existing automated methods of prostate cancer detection mostly rely on ground truth labels with limited accuracy, ignore disease pathology characteristics observed on resected tissue, and cannot selectively identify aggressive (Gleason Pattern≥4) and indolent (Gleason Pattern=3) cancers when they co-exist in mixed lesions. In this paper, we present a radiology-pathology fusion approach, CorrSigNIA, for the selective identification and localization of indolent and aggressive prostate cancer on MRI. CorrSigNIA uses registered MRI and whole-mount histopathology images from radical prostatectomy patients to derive accurate ground truth labels and learn correlated features between radiology and pathology images. These correlated features are then used in a convolutional neural network architecture to detect and localize normal tissue, indolent cancer, and aggressive cancer on prostate MRI. CorrSigNIA was trained and validated on a dataset of 98 men, including 74 who underwent radical prostatectomy and 24 with normal prostate MRI. CorrSigNIA was tested on three independent test sets comprising 55 men who underwent radical prostatectomy, 275 men who underwent targeted biopsies, and 15 men with normal prostate MRI. CorrSigNIA achieved an accuracy of 80% in distinguishing between men with and without cancer, a lesion-level ROC-AUC of 0.81±0.31 in detecting cancers in both radical prostatectomy and biopsy cohort patients, and lesion-level ROC-AUCs of 0.82±0.31 and 0.86±0.26 in detecting clinically significant cancers in radical prostatectomy and biopsy cohort patients, respectively. CorrSigNIA consistently outperformed other methods across different evaluation metrics and cohorts. In clinical settings, CorrSigNIA may be used to detect prostate cancer and to selectively identify its indolent and aggressive components, thereby improving prostate cancer care by helping guide targeted biopsies, reducing unnecessary biopsies, and informing treatment selection and planning.
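As a rough illustration of the pipeline the abstract describes (MRI features projected into a correlated latent space, then a CNN producing per-pixel normal/indolent/aggressive predictions), here is a minimal PyTorch-style sketch. The class name CorrSigNIASketch and the corrnet_encoder/detector interfaces are hypothetical assumptions for illustration, not the authors' released code.

```python
# Hedged sketch of a CorrSigNIA-style inference flow as described in the
# abstract; module names and interfaces are illustrative assumptions.
import torch
import torch.nn as nn


class CorrSigNIASketch(nn.Module):
    def __init__(self, corrnet_encoder: nn.Module, detector: nn.Module):
        super().__init__()
        # Projects MRI features into the correlated (CorrNet) latent space.
        self.corrnet_encoder = corrnet_encoder
        # CNN that outputs per-pixel class scores (normal / indolent / aggressive).
        self.detector = detector

    def forward(self, t2w: torch.Tensor, adc: torch.Tensor) -> torch.Tensor:
        # Stack the MRI inputs channel-wise; at test time only MRI is needed,
        # because correlated features are projected from the MRI side alone.
        mri = torch.cat([t2w, adc], dim=1)            # (B, 2, H, W)
        corr_features = self.corrnet_encoder(mri)     # (B, C, H, W)
        logits = self.detector(torch.cat([mri, corr_features], dim=1))
        return logits.softmax(dim=1)                  # per-pixel class probabilities
```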

Keywords: Computer-aided diagnosis; Correlated feature learning; Prostate cancer; Radiology-pathology fusion.


Conflict of interest statement

Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Figures

Fig. 1:
Registered MRI and histopathology slices of a patient in cohort C1, with cancer labels mapped from histopathology to MRI. (a) T2w image, (b) ADC image, (c) Histopathology image, (d) T2w image overlaid with cancer labels from expert pathologist (black outline) and per-pixel histologic grade labels from automated Gleason grading (Ryu et al., 2019): aggressive labels (yellow), indolent labels (green), (e) Processed ground truth labels generated from (d) that were used for training and evaluation of the models (pre-processing of ground truth labels described in Section 2.3.2): aggressive (yellow), indolent (green), aggressive or indolent with equal likelihood (brown).
Fig. 2:
Schematic representation of our approach. (a) Complete flowchart for CorrSigNIA, (b) Correlated feature learning module which uses the CorrNet model to learn correlated feature representations from registered MRI and histopathology features, (c) Prostate cancer detection and characterization of aggressiveness module, which uses a modified HED architecture to selectively identify and localize normal tissue, indolent cancer, and aggressive cancer.
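To make the correlated feature learning module of panel (b) more concrete, below is a hedged sketch in the spirit of CorrNet: two small encoders project registered MRI and histopathology feature vectors into a shared latent space, and a loss term encourages the two projections to be highly correlated. Layer sizes, the CorrNetSketch class, and the omission of CorrNet's cross-reconstruction terms are simplifying assumptions, not the paper's implementation.

```python
# Hedged, simplified CorrNet-style sketch (cf. Fig. 2b): project each
# modality's features into a shared latent space and maximize their
# correlation. Layer sizes are placeholders; cross-reconstruction omitted.
import torch
import torch.nn as nn


class CorrNetSketch(nn.Module):
    def __init__(self, mri_dim: int = 128, hist_dim: int = 128, latent_dim: int = 32):
        super().__init__()
        self.enc_mri = nn.Linear(mri_dim, latent_dim)    # MRI-side projection
        self.enc_hist = nn.Linear(hist_dim, latent_dim)  # histopathology-side projection

    def forward(self, mri_feat: torch.Tensor, hist_feat: torch.Tensor):
        return self.enc_mri(mri_feat), self.enc_hist(hist_feat)


def correlation_loss(z_mri: torch.Tensor, z_hist: torch.Tensor) -> torch.Tensor:
    # Negative sum of per-dimension Pearson correlations across the batch:
    # minimizing this loss pushes the two latent codes to be correlated.
    z1 = z_mri - z_mri.mean(dim=0)
    z2 = z_hist - z_hist.mean(dim=0)
    corr = (z1 * z2).sum(dim=0) / (z1.norm(dim=0) * z2.norm(dim=0) + 1e-8)
    return -corr.sum()
```

At test time only the MRI-side projection is needed, which is consistent with Fig. 4, where the correlated features used for cancer detection are projected from MRI features alone.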
Fig. 3:
Sextants for lesion-level evaluation. (a) Axial, (b) Sagittal, and (c) Coronal views. The sextant-based approach for evaluating lesions mirrors how biopsies are performed clinically: systematic 12-core needle sampling, with two cores from each sextant, is common practice.
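As a hedged illustration of such a sextant-level evaluation, the sketch below splits each prostate volume into six regions (apex/mid/base × left/right), scores each sextant with the maximum predicted cancer probability inside it, labels it by whether ground-truth cancer is present, and pools the results for ROC-AUC. The exact partitioning and the helper name sextant_scores are assumptions, not the paper's evaluation code.

```python
# Hedged sketch of a sextant-level evaluation (cf. Fig. 3); partitioning
# details and the helper name are assumptions for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score


def sextant_scores(pred_vol, label_vol, mask_vol):
    """pred_vol, label_vol, mask_vol: (slices, H, W) arrays for one prostate."""
    thirds = np.array_split(np.arange(pred_vol.shape[0]), 3)  # apex / mid / base
    mid_w = pred_vol.shape[2] // 2
    scores, labels = [], []
    for slice_idx in thirds:
        for half in (slice(None, mid_w), slice(mid_w, None)):  # left / right
            m = mask_vol[slice_idx][:, :, half].astype(bool)   # prostate voxels in sextant
            if not m.any():
                continue
            scores.append(pred_vol[slice_idx][:, :, half][m].max())        # sextant score
            labels.append(int(label_vol[slice_idx][:, :, half][m].any()))  # cancer present?
    return scores, labels


# Usage: pool (scores, labels) over all patients in a cohort, then compute
# auc = roc_auc_score(all_labels, all_scores)
```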
Fig. 4:
Learning correlated feature representations for prostate cancer for the slice shown in Figure 1. Pre-trained features extracted from (a) T2w image, (b) ADC image, (c) histopathology image. Correlated latent representations formed by projecting (d) from MRI features, and (e) from histopathology features. Column (d) shows the learned correlated MRI feature representations (or CorrNet representations) that are used in the subsequent cancer detection task. Red arrow in column (d) points to cancerous lesion.
Fig. 5:
Detection and localization of aggressive cancer in a patient in cohort C1 shown from apex (top row) to base (bottom row). Registered (a) T2w image, (b) ADC image, (c) histopathology image, (d) T2w image overlaid with ground truth labels: cancer from expert pathologist (black outline), aggressive cancer (yellow) and indolent cancer (green) histologic grading from (Ryu et al., 2019), pixels within pathologist outline without automated histologic grade labels shown in brown, (e) T2w image overlaid with predicted labels from CorrSigNIA: predicted aggressive cancer (yellow) and predicted indolent cancer (green), black outline represents ground truth pathologist cancer outline.
Fig. 6:
Prediction in a man from cohort C3 (without cancer) shown in 4 consecutive slices in the mid gland. (a) T2w, (b) ADC, and (c) CorrSigNIA predictions.
Fig. 7:
Qualitative performance comparison between SPCNet (Seetharaman et al., 2021), U-Net, BrU-Net, and CorrSigNIA in six different patients from the test sets of cohorts C1, C2, and C3. (a) T2w image and (b) ADC image. T2w image overlaid with (c) ground truth (GT) labels: cancer from expert pathologist (black outline), aggressive cancer (yellow) and indolent cancer (green) histologic grade labels from (Ryu et al., 2019). Predicted labels, aggressive cancer (yellow) and indolent cancer (green), from (d) SPCNet, (e) U-Net, (f) branched U-Net (BrU-Net), and (g) CorrSigNIA (ours).

References

    1. Abraham B, Nair MS, 2019. Automated grading of prostate cancer using convolutional neural network and ordinal class classifier. Informatics in Medicine Unlocked 17, 100256.
    2. Ahmed HU, Bosaily AES, Brown LC, Gabe R, Kaplan R, Parmar MK, Collaco-Moraes Y, Ward K, Hindley RG, Freeman A, et al., 2017. Diagnostic accuracy of multi-parametric MRI and TRUS biopsy in prostate cancer (PROMIS): A paired validating confirmatory study. The Lancet 389, 815–822.
    3. Alkadi R, Taher F, El-Baz A, Werghi N, 2019. A deep learning-based approach for the detection and localization of prostate cancer in T2 magnetic resonance images. Journal of Digital Imaging 32, 793–807.
    4. Armato SG, Huisman H, Drukker K, Hadjiiski L, Kirby JS, Petrick N, Redmond G, Giger ML, Cha K, Mamonov A, et al., 2018. PROSTATEx Challenges for computerized classification of prostate lesions from multiparametric magnetic resonance images. Journal of Medical Imaging 5, 044501.
    5. Barentsz JO, Weinreb JC, Verma S, Thoeny HC, Tempany CM, Shtern F, Padhani AR, Margolis D, Macura KJ, Haider MA, et al., 2016. Synopsis of the PI-RADS v2 guidelines for multiparametric prostate magnetic resonance imaging and recommendations for use. European Urology 69, 41.
