Review

Light Sci Appl. 2023 Mar 3;12(1):57. doi: 10.1038/s41377-023-01104-7

Deep learning-enabled virtual histological staining of biological samples

Bijie Bai et al.
Abstract

Histological staining is the gold standard for tissue examination in clinical pathology and life-science research; it visualizes tissue and cellular structures using chromatic dyes or fluorescence labels to aid the microscopic assessment of tissue. However, the current histological staining workflow requires tedious sample preparation steps, specialized laboratory infrastructure, and trained histotechnologists, making it expensive, time-consuming, and inaccessible in resource-limited settings. Deep learning techniques have created new opportunities to revolutionize staining methods by digitally generating histological stains with trained neural networks, providing rapid, cost-effective, and accurate alternatives to standard chemical staining. These techniques, broadly referred to as virtual staining, have been extensively explored by multiple research groups and shown to generate various types of histological stains from label-free microscopic images of unstained samples; similar approaches have also been used to transform images of an already stained tissue sample into another stain type, performing virtual stain-to-stain transformations. In this Review, we provide a comprehensive overview of recent research advances in deep learning-enabled virtual histological staining techniques. The basic concepts and the typical workflow of virtual staining are introduced, followed by a discussion of representative works and their technical innovations. We also share our perspectives on the future of this emerging field, aiming to inspire readers from diverse scientific fields to further expand the scope of deep learning-enabled virtual histological staining techniques and their applications.


Conflict of interest statement

A.O. is the co-founder of a company that commercializes virtual tissue staining technologies.

Figures

Fig. 1. Schematic of the standard histological staining and deep learning-based virtual staining.
a Standard histological staining relies on laborious chemical-based tissue processing and labeling steps. b Pre-trained deep neural networks enable the virtual histological staining of label-free samples as well as the transformation from one stain type to another, without requiring any additional chemical staining procedures
Fig. 2. Training and inference of label-free virtual staining networks.
a Training of a label-free virtual staining network using the supervised scheme. Precisely matched input and ground truth image pairs are required, which can be obtained through a multi-stage image registration process. b Training of a label-free virtual staining network using the unsupervised scheme, in which input and ground truth images are not necessarily paired. Cycle-consistency-based learning frameworks are typically used. c Blind inference of a trained virtual staining model. The virtual histology images are rapidly generated from label-free images using a digital computer
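To make the supervised scheme in panel a concrete, the following is a minimal PyTorch sketch of one training step on a registered image pair. The toy three-layer generator, the random tensors standing in for co-registered label-free/stained patches, and the plain L1 objective are illustrative assumptions only; the networks reviewed here are typically much deeper (U-Net/GAN variants) and combine pixel-wise losses with adversarial terms.

    import torch
    import torch.nn as nn

    # Toy stand-in for a virtual staining generator: a 1-channel label-free input
    # (e.g., an autofluorescence patch) is mapped to a 3-channel RGB "stained" output.
    generator = nn.Sequential(
        nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),
    )
    optimizer = torch.optim.Adam(generator.parameters(), lr=1e-4)
    pixel_loss = nn.L1Loss()

    # Random tensors standing in for a batch of precisely registered image pairs.
    label_free = torch.rand(4, 1, 256, 256)   # label-free input patches
    stained_gt = torch.rand(4, 3, 256, 256)   # matching chemically stained ground truth

    optimizer.zero_grad()
    virtual = generator(label_free)           # generate the virtual stain
    loss = pixel_loss(virtual, stained_gt)    # pixel-wise fidelity term only, for brevity
    loss.backward()
    optimizer.step()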
Fig. 3. Training and inference of stain-to-stain transformation networks.
a Training of a stain-to-stain transformation network using the supervised scheme. b Training of a stain-to-stain transformation network using the unsupervised scheme, in which input and ground truth images are not necessarily paired. c Blind inference of a trained stain-to-stain transformation model. Additional histological stain types can be generated from the existing stain, providing additional diagnostic information without altering the current histopathology workflow
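Panel c of Figs. 2 and 3 both describe blind inference, in which a trained model is simply applied to new images. As a rough illustration (not the authors' whole-slide pipeline, which handles padding, overlap blending, and stitching), a trained network can be applied tile by tile to a large field of view:

    import torch

    def tiled_inference(model, image, tile=256):
        # Apply a trained staining/transformation network patch by patch.
        # `image` is a (C, H, W) tensor whose H and W are assumed to be
        # multiples of `tile`; `model` maps (1, C, tile, tile) -> (1, 3, tile, tile).
        c, h, w = image.shape
        output = torch.zeros(3, h, w)
        model.eval()
        with torch.no_grad():
            for y in range(0, h, tile):
                for x in range(0, w, tile):
                    patch = image[:, y:y + tile, x:x + tile].unsqueeze(0)
                    output[:, y:y + tile, x:x + tile] = model(patch).squeeze(0)
        return output

    # Example with the toy generator from the previous sketch:
    # virtual_fov = tiled_inference(generator, torch.rand(1, 1024, 1024))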
Fig. 4. Examples of label-free virtual staining using different input imaging modalities.
a Virtual H&E, Jones silver, and MT staining using autofluorescence images. b Multiplexed H&E, Jones silver, and MT staining using a single network with autofluorescence images and digital staining matrix as input. c Virtual IHC HER2 staining using autofluorescence images. d Virtual H&E, Jones silver, and MT staining using quantitative phase images (QPI). e Virtual H&E staining using nonlinear multi-modal images. f Virtual H&E staining using bright-field images. g Virtual H&E staining using TA-PARS images. h Virtual acetic acid and H&E staining using in vivo RCM images. All the scale bars represent 100 μm
Fig. 5. Examples of virtual stain-to-stain transformations.
a Transformation from H&E staining into virtual Jones silver, MT, and PAS staining. b Transformation from H&E staining into virtual IHC Ki-67 staining. c Transformation from Ki-67 IHC staining into multiplexed virtual IF staining. d Transformation from H&E staining into virtual panCK IF staining. e Virtual H&E staining using Hoechst-stained MUSE images. Adapted with permission from ref. © The Optical Society
Fig. 6. Multi-stage image cross-registration workflow.
The label-free WSI and its corresponding histologically stained WSI are first roughly registered and cropped into coarsely matched image pairs. Then a pseudo virtual staining model is trained to transform the label-free images into the histological images, which assists the local feature registration using an elastic pyramidal registration algorithm
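As a simplified illustration of the first (coarse) stage of this workflow, a translation-only alignment of two roughly matched grayscale fields of view can be estimated with phase correlation. This sketch assumes scikit-image and SciPy are available and deliberately omits the pseudo virtual staining model and the elastic pyramidal refinement described above.

    import numpy as np
    from scipy.ndimage import shift as nd_shift
    from skimage.registration import phase_cross_correlation

    def coarse_register(reference, moving):
        # Estimate the translation that aligns `moving` with `reference`
        # and resample `moving` onto the reference grid.
        offset, error, _ = phase_cross_correlation(reference, moving, upsample_factor=10)
        return nd_shift(moving, shift=offset), offset

    # Toy example: a synthetic "label-free" image and a translated copy of it.
    rng = np.random.default_rng(0)
    reference = rng.random((512, 512))
    moving = np.roll(reference, shift=(7, -3), axis=(0, 1))
    registered, offset = coarse_register(reference, moving)
    print("estimated shift:", offset)   # recovers the synthetic translation (up to sign convention)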
Fig. 7. Virtual staining network architectures and training scheme.
a For supervised learning, when paired image data are available, GAN and its variants are typically used. When partially paired data are available, a cascaded GAN that optimizes sequential image transformation models can be used. b For unsupervised training with unpaired image data, CycleGAN and its variants are typically used. Adapted with permission from ref. © The Optical Society
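As an illustration of the cycle-consistency idea behind panel b, the sketch below computes the forward and backward reconstruction losses for two toy generators on unpaired batches. The tiny two-layer generators and random tensors are assumptions for brevity; an actual CycleGAN additionally trains domain discriminators and adds weighted adversarial and identity terms.

    import torch
    import torch.nn as nn

    def tiny_generator(in_ch, out_ch):
        # Drastically simplified stand-in for a CycleGAN generator.
        return nn.Sequential(
            nn.Conv2d(in_ch, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    G = tiny_generator(1, 3)   # label-free (1 channel) -> virtual stain (RGB)
    F = tiny_generator(3, 1)   # stained (RGB) -> virtual label-free
    l1 = nn.L1Loss()

    label_free = torch.rand(2, 1, 128, 128)   # unpaired label-free patches
    stained = torch.rand(2, 3, 128, 128)      # unpaired histologically stained patches

    # Cycle consistency: mapping to the other domain and back should reproduce
    # the original image, which constrains training when no paired data exist.
    cycle_loss = l1(F(G(label_free)), label_free) + l1(G(F(stained)), stained)
    cycle_loss.backward()   # combined with adversarial losses in a full CycleGAN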
Fig. 8. Evaluation methods for virtual staining neural network models.
a Standard quantitative metrics such as PSNR and SSIM are calculated based on output images and their corresponding ground truth images. b Pathological features are extracted, and the statistical correlations between the features from the virtual and histological staining methods are compared. c The diagnostic values and the staining quality of the virtually generated images are evaluated by expert pathologists and compared against the histologically stained ones. d Validated digital pathology DNN models that can perform downstream diagnostic analysis are used to evaluate the clinically relevant characteristics of virtually and histologically stained images
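For panel a, the standard full-reference metrics can be computed directly with scikit-image (version 0.19 or later is assumed for the channel_axis argument); the random arrays below merely stand in for a virtually stained image and its chemically stained ground truth.

    import numpy as np
    from skimage.metrics import peak_signal_noise_ratio, structural_similarity

    # Placeholder RGB images in [0, 1] standing in for a virtual/ground-truth pair.
    rng = np.random.default_rng(1)
    ground_truth = rng.random((256, 256, 3))
    virtual = np.clip(ground_truth + 0.05 * rng.standard_normal(ground_truth.shape), 0.0, 1.0)

    psnr = peak_signal_noise_ratio(ground_truth, virtual, data_range=1.0)
    ssim = structural_similarity(ground_truth, virtual, data_range=1.0, channel_axis=-1)
    print(f"PSNR: {psnr:.2f} dB  SSIM: {ssim:.3f}")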
