Biomed Opt Express. 2022 Jul 5;13(8):4087–4101. doi: 10.1364/BOE.461411. eCollection 2022 Aug 1.

ADS-Net: attention-awareness and deep supervision based network for automatic detection of retinopathy of prematurity


Yuanyuan Peng et al., Biomed Opt Express.

Abstract

Retinopathy of prematurity (ROP) is a proliferative vascular disease and one of the most dangerous and severe ocular complications in premature infants. An automatic ROP detection system can assist ophthalmologists in diagnosing ROP in a safe, objective, and cost-effective way. Unfortunately, due to the large local redundancy and complex global dependencies involved in medical image processing, it is challenging to learn discriminative representations from ROP-related fundus images. To bridge this gap, a novel attention-awareness and deep supervision based network (ADS-Net) is proposed to detect the existence of ROP (Normal or ROP) and to perform 3-level ROP grading (Mild, Moderate, or Severe). First, to balance the problems of large local redundancy and complex global dependencies in images, we design a multi-semantic feature aggregation (MsFA) module based on the self-attention mechanism that takes full advantage of both convolution and self-attention, generating attention-aware expressive features. Then, to address the difficulty of training deep models and to further improve ROP detection performance, we propose an optimization strategy with a deeply supervised loss. Finally, the proposed ADS-Net is evaluated on ROP screening and grading tasks with per-image and per-examination strategies, respectively. Under the per-image classification setting, ADS-Net achieves Kappa indices of 0.9552 and 0.9037 for ROP screening and grading, respectively. Experimental results demonstrate that the proposed ADS-Net generally outperforms other state-of-the-art classification networks, showing the effectiveness of the proposed method.


Conflict of interest statement

The authors declare that there are no conflicts of interest related to this article.

Figures

Fig. 1.
Examples of a normal fundus and ROP stages 1 to 5. The ROP lesion areas are marked with red boxes. (a) Normal. (b) Stage 1. (c) Stage 2. (d) Stage 3. (e) Stage 4. (f) Stage 5.
Fig. 2.
An overview of the proposed ADS-Net based ROP detection framework. The ADS-Net consists of a feature extractor and three classifiers: the two auxiliary classifiers are shown in the red and blue dotted boxes, and the master classifier is shown in the green dotted box. In addition, ‘MP’, ‘AAP’, ‘fc’, and ‘softmax’ denote the max pooling operator, adaptive average pooling operator, fully connected layer, and Softmax activation layer, respectively, while ‘L’ denotes a stack of dense connection modules, ‘P’ denotes the predicted classification results, and ‘MsFA’ denotes the proposed multi-semantic feature aggregation module shown in Fig. 3.
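The deeply supervised training described above attaches losses to the auxiliary classifiers as well as the master classifier. As a rough illustration only (the paper's exact loss weights and formulation are not given here, so the weighting scheme below is an assumption), the combined objective can be sketched as a weighted sum of per-classifier cross-entropy terms:

```python
import math

def cross_entropy(probs, label):
    # Negative log-likelihood of the true class, given softmax probabilities.
    return -math.log(probs[label])

def deeply_supervised_loss(p_master, p_aux1, p_aux2, label,
                           w_master=1.0, w_aux=0.4):
    """Weighted sum of master and auxiliary classifier losses.

    p_master, p_aux1, p_aux2: softmax outputs of the three classifiers.
    The 0.4 auxiliary weight is an illustrative choice, not the
    paper's reported setting.
    """
    return (w_master * cross_entropy(p_master, label)
            + w_aux * cross_entropy(p_aux1, label)
            + w_aux * cross_entropy(p_aux2, label))
```

At inference time only the master classifier's prediction would typically be used; the auxiliary branches exist to inject gradient signal into intermediate layers during training.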
Fig. 3.
Multi-semantic feature aggregation (MsFA) module. ‘F1’ and ‘F2’ represent the two input features of the MsFA module, which come from ‘L3’ and ‘L4’ in Fig. 2, respectively. ‘E’, ‘FT’, and ‘Ff’ represent the similarity matrix, spatial response matrix, and output feature, respectively. ‘T’ is a transform operation implemented by a 1 × 1 convolution, as shown in Eq. (1). In addition, ‘Q’, ‘K’, and ‘V’ correspond to the three branches of the self-attention mechanism (query, key, and value), realized by three 1 × 1 convolutions, as shown in Eqs. (2), (3), and (4).
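The paper's Eqs. (1)–(4) are not reproduced on this page, so the following is a minimal sketch of the attention computation the caption describes, assuming standard scaled dot-product attention between two flattened feature maps (the variable names `q`, `k`, `v` mirror the caption's Q/K/V branches; the pairing of F1 with queries and F2 with keys/values is an assumption):

```python
import math

def softmax(row):
    # Numerically stable softmax over one row of scores.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(a, b):
    # Nested-list matrix multiply: (n x k) @ (k x m) -> (n x m).
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(col) for col in zip(*a)]

def cross_attention(q, k, v):
    """softmax(Q K^T / sqrt(d)) V on flattened feature maps.

    q: n1 x d queries (e.g. projected from F1)
    k, v: n2 x d keys/values (e.g. projected from F2)
    Returns an n1 x d attended feature map.
    """
    d = len(q[0])
    scores = matmul(q, transpose(k))  # n1 x n2 similarity matrix ("E")
    scale = math.sqrt(d)
    weights = [softmax([s / scale for s in row]) for row in scores]
    return matmul(weights, v)
```

In the actual module the Q/K/V projections are 1 × 1 convolutions over spatial feature maps; here they are assumed to have already been applied and the maps flattened to one row per spatial location.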

