Review
Sensors (Basel). 2021 Apr 8;21(8):2625. doi: 10.3390/s21082625.

Visibility Restoration: A Systematic Review and Meta-Analysis

Dat Ngo et al.

Abstract

Image acquisition is a complex process affected by a wide variety of internal and environmental factors; visibility restoration is therefore crucial for many high-level applications in photography and computer vision. This paper provides a systematic review and meta-analysis of visibility restoration algorithms, focusing on those pertinent to poor weather conditions. It begins with an introduction to optical image formation and then gives a comprehensive description of existing algorithms together with a comparative evaluation. A thorough discussion follows on current difficulties that merit further scientific effort. Moreover, this paper proposes a general framework for visibility restoration in hazy weather conditions using haze-relevant features and maximum likelihood estimates. Finally, a discussion of the findings and future developments concludes the paper.

Keywords: haze removal; image defogging; image dehazing; meta-analysis; systematic review; visibility enhancement.
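The abstract frames dehazing in terms of optical image formation in the atmosphere (Figure 3), which is commonly expressed as I(x) = J(x)t(x) + A(1 − t(x)), where I is the observed hazy image, J the scene radiance, t the transmission, and A the atmospheric light. As a minimal, hedged sketch (not the authors' proposed framework), the dark channel prior-based pipeline summarized in Figure 6 can be outlined as follows; patch size, omega, and the 0.1% airlight fraction are conventional choices, not values taken from this paper:

```python
import numpy as np

def dark_channel(image, patch=15):
    """Per-pixel minimum over the color channels, then a minimum
    filter over a local square patch (naive implementation)."""
    min_rgb = image.min(axis=2)
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode="edge")
    h, w = min_rgb.shape
    out = np.empty_like(min_rgb)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def dehaze(hazy, omega=0.95, t0=0.1, patch=15):
    """Recover J from I = J*t + A*(1 - t) via the dark channel prior."""
    dc = dark_channel(hazy, patch)
    # Atmospheric light A: pixel values among the top 0.1% of the dark channel.
    n = max(1, int(dc.size * 0.001))
    idx = np.unravel_index(np.argsort(dc, axis=None)[-n:], dc.shape)
    A = hazy[idx].max(axis=0)
    # Transmission estimate: t(x) = 1 - omega * dark_channel(I / A).
    t = np.clip(1.0 - omega * dark_channel(hazy / A, patch), t0, 1.0)
    # Scene radiance: J(x) = (I(x) - A) / t(x) + A, clipped to [0, 1].
    return np.clip((hazy - A) / t[..., None] + A, 0.0, 1.0)
```

The lower bound t0 prevents division by near-zero transmission, and omega < 1 retains a trace of haze for depth perception, as in the original dark channel prior formulation.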


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Typical example of the digital camera workflow.
Figure 2. PRISMA flow diagram for the systematic review in this study.
Figure 3. Visual illustration of optical image formation in the atmosphere.
Figure 4. General classification of visibility restoration algorithms.
Figure 5. Simplified block diagrams of (a) contrast enhancement and (b) polarimetric dehazing approaches in visibility restoration.
Figure 6. Simplified block diagram of dark channel prior-based visibility restoration methods.
Figure 7. Simplified block diagram of image fusion-based visibility restoration methods.
Figure 8. Branching diagram summarizing image-processing-based dehazing algorithms.
Figure 9. Simplified block diagram summarizing regression- and regularization-based visibility restoration methods.
Figure 10. Branching diagram summarizing machine-learning-based dehazing algorithms.
Figure 11. Simplified block diagram of convolutional neural network-based visibility restoration algorithms.
Figure 12. Simplified block diagram of generative adversarial network-based visibility restoration algorithms.
Figure 13. Branching diagram summarizing deep-learning-based dehazing algorithms.
Figure 14. Typical procedure for preparing the synthetic training dataset.
Figure 15. Proposed dehazing framework based on a machine learning technique.
Figure 16. Selection of representative hazy patches based on (a) mean subtracted contrast normalized, (b) sharpness, (c) contrast, (d) entropy, (e) dark channel prior, and (f) saturation features; (g) selected patches.
Figure 17. Selection of representative haze-free patches based on (a) mean subtracted contrast normalized, (b) sharpness, (c) contrast, (d) entropy, (e) dark channel prior, and (f) saturation features; (g) selected patches.
Figure 18. Atmospheric light estimation procedure utilized by (a) Zhu et al. [52], where red pixels belong to the top 0.1% brightest pixels, and (b) Park et al. [121], where blue lines represent the quadtree decomposition process and the red dot represents the atmospheric light estimate.
Figure 19. Qualitative comparison of different dehazing methods on a real hazy image of a train: (a) hazy image, and results by (b) Tarel and Hautiere [35], (c) He et al. [21], (d) Kim et al. [36], (e) Bui and Kim [50], (f) Zhu et al. [52], (g) Ngo et al. [74], (h) Cai et al. [85], (i) Ren et al. [89], and (j) the proposed framework.
Figure 20. Qualitative comparison of different dehazing methods on a real hazy image of mountains: (a) hazy image, and results by (b) Tarel and Hautiere [35], (c) He et al. [21], (d) Kim et al. [36], (e) Bui and Kim [50], (f) Zhu et al. [52], (g) Ngo et al. [74], (h) Cai et al. [85], (i) Ren et al. [89], and (j) the proposed framework.
Figure 21. Qualitative comparison of different dehazing methods on a real hazy image of a road scene: (a) hazy image, and results by (b) Tarel and Hautiere [35], (c) He et al. [21], (d) Kim et al. [36], (e) Bui and Kim [50], (f) Zhu et al. [52], (g) Ngo et al. [74], (h) Cai et al. [85], (i) Ren et al. [89], and (j) the proposed framework.
Figure 22. Qualitative comparison of different dehazing methods on a synthetic hazy image of a road scene: (a) hazy image, results by (b) Tarel and Hautiere [35], (c) He et al. [21], (d) Kim et al. [36], (e) Bui and Kim [50], (f) Zhu et al. [52], (g) Ngo et al. [74], (h) Cai et al. [85], (i) Ren et al. [89], and (j) the proposed framework, and (k) ground truth.
Figure 23. Qualitative comparison of different dehazing methods on a synthetic hazy image of an indoor scene: (a) hazy image, results by (b) Tarel and Hautiere [35], (c) He et al. [21], (d) Kim et al. [36], (e) Bui and Kim [50], (f) Zhu et al. [52], (g) Ngo et al. [74], (h) Cai et al. [85], (i) Ren et al. [89], and (j) the proposed framework, and (k) ground truth.
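Figure 18(b) depicts atmospheric light estimation by quadtree decomposition: the image is recursively split into quadrants, the brightest quadrant is kept, and the estimate is read off from the final region. The sketch below is a minimal illustration of that idea, not Park et al.'s exact procedure [121]; the depth and minimum region size are illustrative assumptions:

```python
import numpy as np

def quadtree_airlight(gray, min_size=2, depth=4):
    """Locate the atmospheric light in a grayscale image by repeatedly
    descending into the quadrant with the highest mean intensity,
    then returning the coordinates of the brightest remaining pixel."""
    y0, x0 = 0, 0
    region = gray
    for _ in range(depth):
        h, w = region.shape
        if h < 2 * min_size or w < 2 * min_size:
            break
        hh, hw = h // 2, w // 2
        quads = {
            (0, 0): region[:hh, :hw],
            (0, hw): region[:hh, hw:],
            (hh, 0): region[hh:, :hw],
            (hh, hw): region[hh:, hw:],
        }
        # Keep the quadrant with the highest mean and track its offset.
        (dy, dx), region = max(quads.items(), key=lambda kv: kv[1].mean())
        y0, x0 = y0 + dy, x0 + dx
    dy, dx = np.unravel_index(np.argmax(region), region.shape)
    return int(y0 + dy), int(x0 + dx)
```

Compared with simply taking the globally brightest pixel, descending by quadrant mean makes the estimate more robust to isolated bright outliers such as headlights or specular highlights.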

References

    1. Parulski K., Spaulding K. Color image processing for digital cameras. In: Sharma G., editor. Digital Color Imaging Handbook. CRC Press; Boca Raton, FL, USA: 2003. pp. 734–739. Chapter 12.
    2. Oakley J.P., Satherley B.L. Improving image quality in poor visibility conditions using a physical model for contrast degradation. IEEE Trans. Image Process. 1998;7:167–179. doi: 10.1109/83.660994.
    3. Tan K.K., Oakley J.P. Physics-based approach to color image enhancement in poor visibility conditions. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2001;18:2460–2467. doi: 10.1364/JOSAA.18.002460.
    4. Liu Z., He Y., Wang C., Song R. Analysis of the Influence of Foggy Weather Environment on the Detection Effect of Machine Vision Obstacles. Sensors. 2020;20:349. doi: 10.3390/s20020349.
    5. Pei Y., Huang Y., Zou Q., Zhang X., Wang S. Effects of Image Degradation and Degradation Removal to CNN-based Image Classification. IEEE Trans. Pattern Anal. Mach. Intell. 2019. doi: 10.1109/TPAMI.2019.2950923.
