Review

Current Status and Future Perspectives of Artificial Intelligence in Magnetic Resonance Breast Imaging

Anke Meyer-Bäse et al. Contrast Media Mol Imaging. 2020 Aug 28;2020:6805710. doi: 10.1155/2020/6805710. eCollection 2020.

Abstract

Recent advances in artificial intelligence (AI) and deep learning (DL) have impacted many scientific fields including biomedical imaging. Magnetic resonance imaging (MRI) is a well-established method in breast imaging with several indications including screening, staging, and therapy monitoring. The rapid development and subsequent implementation of AI into clinical breast MRI has the potential to affect clinical decision-making, guide treatment selection, and improve patient outcomes. The goal of this review is to provide a comprehensive picture of the current status and future perspectives of AI in breast MRI. We will review DL applications and compare them to standard data-driven techniques. We will emphasize the development of quantitative imaging biomarkers for precision medicine and the potential of breast MRI and DL in this context. Finally, we will discuss future challenges of DL applications for breast MRI and an AI-augmented clinical decision strategy.


Conflict of interest statement

The authors declare that they have no conflicts of interest.

Figures

Figure 1
Differences between conventional machine learning and deep learning in breast MRI for the lesion discrimination task. The upper part of the image represents traditional radiomics-based processing. Features such as texture, shape, and histogram are fused to describe the tumor. These engineered features are defined based on expert knowledge. They are extracted from an accurate segmentation, which may be performed automatically or, more often, semiautomatically by an expert radiologist. The lower part shows the DL-based processing. Several deeper-layer features, from low level (edges) to high level (objects), are learned automatically by the network. This approach does not require an explicit segmentation step and can be applied directly to the raw images, trained only from lesion-level class labels.
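As a rough illustration of the contrast described in this caption, the sketch below pairs a handcrafted-feature classifier with a small CNN trained end to end on raw ROIs. It is a minimal sketch only, not the pipeline of any reviewed study: the synthetic data, feature definitions, and network size are assumptions made for illustration.

```python
# Minimal sketch: radiomics-style pipeline vs. end-to-end CNN (illustrative only).
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
rois = rng.random((20, 64, 64)).astype(np.float32)   # synthetic lesion ROIs (stand-ins for MRI patches)
masks = (rois > 0.5).astype(np.uint8)                # stand-in lesion segmentations
labels = rng.integers(0, 2, size=20)                 # benign (0) vs. malignant (1)

# --- Conventional radiomics: engineered features from a segmented lesion ---
def handcrafted_features(roi, mask):
    """Toy histogram/shape/texture descriptors computed inside the lesion mask."""
    voxels = roi[mask > 0]
    hist, _ = np.histogram(voxels, bins=16, density=True)               # histogram features
    shape = np.array([mask.sum(), mask.sum() / mask.size])              # crude shape features
    texture = np.array([voxels.std(), np.abs(np.diff(voxels)).mean()])  # crude texture features
    return np.concatenate([hist, shape, texture])

X = np.stack([handcrafted_features(r, m) for r, m in zip(rois, masks)])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)

# --- Deep learning: features learned end to end from the raw ROI ----------
class LesionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(               # low-level (edges) to higher-level features
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)           # trained only from lesion-level labels

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

logits = LesionCNN()(torch.from_numpy(rois).unsqueeze(1))  # no explicit segmentation step
```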
Figure 2
Flowchart shows selection of studies for inclusion in the narrative review (a); selected studies are further characterized according to the main focus (b).
Figure 3
Deep learning network with U-net architecture. Reprinted with permission from [34].
Figure 4
Two different approaches for applying U-net to breast and fibroglandular tissue (FGT) segmentation. The upper part shows the 2C U-nets approach, in which two consecutive U-nets are used. The lower part illustrates the other approach, a single U-net with a 3-class output. Pnb, Pbreast, Pfat, and PFGT denote the probabilities that a voxel belongs to nonbreast, breast, fat, and FGT, respectively. Reprinted with permission from [34].
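A minimal sketch of the two strategies follows, using a toy one-level encoder-decoder in place of the full U-net of [34] and assumed input sizes; it only illustrates how the 2C U-nets and the single 3-class U-net differ in their output heads.

```python
# Toy encoder-decoder standing in for a U-net; shapes and depths are assumptions.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """One-level encoder-decoder with a skip connection; `out_ch` is the number
    of per-voxel classes predicted by the final 1x1 convolution."""
    def __init__(self, out_ch):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec = nn.Sequential(nn.Conv2d(48, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, out_ch, 1)

    def forward(self, x):
        e = self.enc(x)
        m = self.up(self.mid(self.down(e)))
        return self.head(self.dec(torch.cat([e, m], dim=1)))

x = torch.randn(1, 1, 64, 64)                       # one slice (toy size)

# Approach 1: two consecutive U-nets ("2C U-nets")
breast_net, tissue_net = TinyUNet(out_ch=2), TinyUNet(out_ch=2)
p_breast = breast_net(x).softmax(dim=1)             # per-voxel P_nb, P_breast
breast_mask = p_breast.argmax(dim=1, keepdim=True).float()
p_tissue = tissue_net(x * breast_mask).softmax(dim=1)   # P_fat, P_FGT within the breast

# Approach 2: a single U-net with a 3-class output
single_net = TinyUNet(out_ch=3)
p = single_net(x).softmax(dim=1)                    # channels: P_nb, P_fat, P_FGT
```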
Figure 5
Two-step transfer learning approach for leveraging temporal information in a pretrained CNN. In the first step (a), the CNN is fine-tuned on pseudocolor ROIs formed from the precontrast and first and second postcontrast frames, mimicking the three channels of an RGB image. In the second step (b), image features extracted from the trained CNN at each DCE timepoint are used to train an LSTM network, which learns to distinguish contrast enhancement patterns. Reprinted from [58].
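The sketch below illustrates this two-step idea under stated assumptions: the ResNet-18 backbone, 224x224 ROI size, and five DCE timepoints are placeholders rather than the configuration of [58], and pretrained weights are omitted so the snippet runs offline.

```python
# Minimal sketch of the two-step transfer-learning scheme (assumed backbone and sizes).
import torch
import torch.nn as nn
from torchvision.models import resnet18

# Step (a): fine-tune a pretrained CNN on pseudocolor ROIs, where the three
# input channels are the precontrast and first/second postcontrast frames.
backbone = resnet18()                                    # in practice, ImageNet-pretrained weights would be loaded
backbone.fc = nn.Linear(backbone.fc.in_features, 2)      # lesion classification head for fine-tuning
pseudocolor_roi = torch.randn(2, 3, 224, 224)            # [pre, post1, post2] stacked like RGB channels
logits = backbone(pseudocolor_roi)

# Step (b): extract CNN features at every DCE timepoint and feed the sequence
# to an LSTM that learns the contrast-enhancement pattern.
feature_extractor = nn.Sequential(*list(backbone.children())[:-1])  # drop the classification head
T = 5                                                    # assumed number of DCE timepoints
frames = torch.randn(2, T, 3, 224, 224)                  # each grayscale frame replicated to 3 channels
feats = torch.stack(
    [feature_extractor(frames[:, t]).flatten(1) for t in range(T)], dim=1
)                                                        # (batch, T, 512)
lstm = nn.LSTM(input_size=512, hidden_size=64, batch_first=True)
out, _ = lstm(feats)
kinetic_logits = nn.Linear(64, 2)(out[:, -1])            # classify from the last timestep (untrained head, illustration only)
```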
Figure 6
Flow diagram of sentinel lymph node prediction. Reprinted with permission from [113].

References

    1. Mann R. M., Balleyguier C., et al. Breast MRI: EUSOBI recommendations for women's information. European Radiology. 2015;25(12):3669–3678. doi: 10.1007/s00330-015-3807-z. - DOI - PMC - PubMed
    2. National Research Council. Toward Precision Medicine: Building a Knowledge Network for Biomedical Research and a New Taxonomy of Disease. Washington, DC, USA: National Academies Press; 2011. - PubMed
    3. Mitchell T. Machine Learning. New York, NY, USA: McGraw-Hill; 1997.
    4. Wang S., Summers R. M. Machine learning and radiology. Medical Image Analysis. 2012;16(5):933–951. doi: 10.1016/j.media.2012.02.005. - DOI - PMC - PubMed
    5. Aerts H., Velazquez E., Leijenaar R., et al. Decoding tumour phenotype by noninvasive imaging using a quantitative radiomics approach. Nature Communications. 2014;5(1):p. 4644. doi: 10.1038/ncomms5644. - DOI - PMC - PubMed