Colored Texture Analysis Fuzzy Entropy Methods with a Dermoscopic Application

Mirvana Hilal et al.
Entropy (Basel). 2022 Jun 15;24(6):831. doi: 10.3390/e24060831.

Abstract

Texture analysis is a subject of intensive research focus due to its significant role in the field of image processing. However, few studies focus on colored texture analysis, and even fewer use information theory concepts. Entropy measures have proven competent for grayscale images. However, to the best of our knowledge, there are no well-established entropy methods that deal with colored images yet. Therefore, we propose the recent colored bidimensional fuzzy entropy measure, FuzEnC2D, and introduce its new multi-channel approaches, FuzEnV2D and FuzEnM2D, for the analysis of colored images. We investigate their sensitivity to parameters and their ability to identify images with different degrees of irregularity, and therefore different textures. Moreover, we study their behavior with colored Brodatz images in different color spaces. After verifying the results with test images, we employ the three methods to analyze dermoscopic images of malignant melanoma and benign melanocytic nevi. FuzEnC2D, FuzEnV2D, and FuzEnM2D show a good ability to differentiate between the two pigmented skin lesions, despite their similar appearance. The results outperform those of a well-known texture analysis measure. Our work provides the first entropy measure studying colored images using both single- and multi-channel approaches.

Keywords: colored texture analysis; dermoscopy; entropy; fuzzy entropy; information theory; medical image analysis; melanoma; texture analysis.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Illustration for FuzEnC2D of an RGB color space image. (a) The image U is split into its corresponding channels U_R, U_G, and U_B, respectively, from left to right; (b) the embedding dimension pattern of size m × m with m = [2,2]; (c) X_{i,j,K}^{m} and X_{a,b,K}^{m} for K = K1, K2, and K3 being the R, G, and B color channels, respectively.
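As the caption suggests, the single-channel approach splits the RGB image into its three channels and computes a bidimensional fuzzy entropy on each one independently. The following is a minimal, unoptimized Python sketch of that idea, assuming mean-removed square templates, the Chebyshev distance, and the exponential fuzzy membership function exp(-d^n / r); the helper names fuzen2d_channel and fuzenc2d are ours, and this is not the authors' reference implementation.

```python
import numpy as np

def fuzen2d_channel(channel, m=2, r=None, n=2):
    """Bidimensional fuzzy entropy of a single channel (minimal sketch)."""
    u = np.asarray(channel, dtype=float)
    if r is None:
        r = 0.2 * u.std()          # common default: tolerance = 0.2 * channel SD
    H, W = u.shape
    rows, cols = H - m, W - m      # same template positions for sizes m and m + 1

    def phi(size):
        # All size x size templates, each with its own mean removed (local baseline).
        patches = np.array([
            u[i:i + size, j:j + size] - u[i:i + size, j:j + size].mean()
            for i in range(rows) for j in range(cols)
        ])
        N = len(patches)
        total = 0.0
        for a in range(N):         # O(N^2): practical for small ROIs only
            d = np.abs(patches - patches[a]).max(axis=(1, 2))   # Chebyshev distance
            total += np.exp(-(d ** n) / r).sum() - 1.0          # drop the self-match
        return total / (N * (N - 1))

    return np.log(phi(m)) - np.log(phi(m + 1))

def fuzenc2d(rgb, m=2, r=None):
    """FuzEnC2D sketch: one entropy value per color channel (R, G, B)."""
    return [fuzen2d_channel(rgb[..., k], m=m, r=r) for k in range(3)]
```

The double loop over templates makes this sketch quadratic in the number of pixels, so it is only practical for small regions such as the 128 × 128 ROIs described below for the dermoscopic images.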
Figure 2. Illustration for FuzEnV2D of an RGB color space image with m = [2,2,2]. (a) A portion of the colored image U with its R, G, and B channels; (b) the scanning pattern or embedding dimension with m = [2,2,2], i.e., a 2 × 2 × 2 cube; (c) X_{i,j,k}^{m} and X_{a,b,c}^{m}, the fixed and moving templates defined above.
Figure 3. Illustration for FuzEnM2D of an RGB color space image with m = [2,2,3]. (a) A portion of the colored image U with its R, G, and B channels; (b) the scanning pattern or embedding dimension with m = [2,2,3], i.e., a 2 × 2 × 3 cuboid; (c) the fixed and moving templates defined above.
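FuzEnV2D and FuzEnM2D differ from the single-channel approach only in the shape of the template: the RGB image is treated as an H × W × 3 volume, and the templates become 2 × 2 × 2 cubes (m = [2,2,2]) or 2 × 2 × 3 cuboids spanning all channels (m = [2,2,3]). Below is a minimal sketch under the same assumptions as the single-channel one (mean-removed templates, Chebyshev distance, exponential membership); we additionally assume that only the spatial sides of the template grow from m to m + 1 while the channel depth stays fixed, which is one plausible reading of the captions rather than the paper's stated construction. The function name fuzen_volume is ours.

```python
import numpy as np

def fuzen_volume(rgb, m=(2, 2, 2), r=None, n=2):
    """Volumetric fuzzy entropy sketch: m = (2, 2, 2) ~ FuzEnV2D, m = (2, 2, 3) ~ FuzEnM2D."""
    u = np.asarray(rgb, dtype=float)
    if r is None:
        r = 0.2 * u.std()
    H, W, C = u.shape
    mi, mj, mk = m
    rows, cols = H - mi, W - mj        # leave room for the enlarged (m + 1) templates
    depths = max(C - mk, 0) + 1        # e.g. 2 depth positions for mk = 2, 1 for mk = 3

    def phi(si, sj, sk):
        patches = np.array([
            u[i:i + si, j:j + sj, k:k + sk]
            - u[i:i + si, j:j + sj, k:k + sk].mean()
            for i in range(rows) for j in range(cols) for k in range(depths)
        ])
        N = len(patches)
        total = 0.0
        for a in range(N):                                          # O(N^2), small ROIs only
            d = np.abs(patches - patches[a]).max(axis=(1, 2, 3))    # Chebyshev distance
            total += np.exp(-(d ** n) / r).sum() - 1.0              # drop the self-match
        return total / (N * (N - 1))

    # Assumption: only the spatial sides grow from m to m + 1; the depth stays at mk.
    return np.log(phi(mi, mj, mk)) - np.log(phi(mi + 1, mj + 1, mk))
```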
Figure 4. Colored Brodatz texture (CBT) images with different degrees of colored irregularity [41,42]. (a-i) CBT images used for the validation test (Section 4.3), comparing the entropy values of each colored texture to those of its corresponding sub-images in three color spaces (RGB, HSV, and YUV); (f) is used again for studying the sensitivity of the proposed measures to different initial parameters (Section 4.1).
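Because the Brodatz textures are analyzed in RGB, HSV, and YUV, a color space conversion precedes the entropy computation. A minimal sketch using scikit-image; the use of scikit-image is our assumption, and any equivalent conversion routine would do.

```python
import numpy as np
from skimage import color, io

def to_color_spaces(path):
    """Load an RGB image and return it in the three color spaces used in the study."""
    rgb = io.imread(path)
    if rgb.dtype == np.uint8:
        rgb = rgb / 255.0                 # work with floats in [0, 1]
    return {
        "RGB": rgb,
        "HSV": color.rgb2hsv(rgb),        # channels: hue, saturation, value
        "YUV": color.rgb2yuv(rgb),        # channels: luma Y, chroma U and V
    }
```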
Figure 5. Dermoscopic image segmentation for choosing the region of interest (ROI). (a) An example dermoscopic image of a pigmented skin lesion; (b,c) the contouring and segmentation of the lesion; (d) the ROI, taken as the central 128 × 128 × 3 pixels.
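Extracting the central 128 × 128 × 3 ROI after segmentation is a plain indexing step. A minimal sketch (the helper name crop_center_roi is ours):

```python
import numpy as np

def crop_center_roi(image, size=128):
    """Return the central size x size (x channels) block of an image."""
    h, w = image.shape[:2]
    if h < size or w < size:
        raise ValueError("image smaller than the requested ROI")
    top = (h - size) // 2
    left = (w - size) // 2
    return image[top:top + size, left:left + size, ...]
```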
Figure 6. FuzEnC2D results for the red, green, and blue channels (left to right) of the colored Brodatz image in Figure 4f, with varying r and m.
Figure 7. FuzEnV2D results with varying r and m for the colored Brodatz image in Figure 4f.
Figure 8. FuzEnM2D results with varying r and m for the colored Brodatz image in Figure 4f.
Figure 9. FuzEnC2D mean and standard deviation for MIX2D(p) images over 10 repetitions.
Figure 10. FuzEnV2D mean and standard deviation for MIX3D(p) images over 10 repetitions.
Figure 11. FuzEnM2D mean and standard deviation for MIX3D(p) images.
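MIX2D(p) and MIX3D(p) are synthetic test images whose irregularity increases with p: each pixel of a deterministic sinusoidal pattern is replaced by uniform noise with probability p, and the construction is repeated (10 times in Figures 9 and 10) with fresh random draws. The sketch below follows the common formulation from the bidimensional entropy literature; the sinusoid period of 12 samples and the noise range are our assumptions, not values stated here.

```python
import numpy as np

def mix2d(p, size=100, period=12, rng=None):
    """MIX2D(p) sketch: sinusoidal pattern with a fraction p of pixels replaced by noise."""
    rng = np.random.default_rng(rng)
    i, j = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
    x = np.sin(2 * np.pi * i / period) + np.sin(2 * np.pi * j / period)   # regular pattern
    y = rng.uniform(x.min(), x.max(), size=x.shape)                       # irregular noise
    z = rng.random(x.shape) < p                                           # Bernoulli(p) mask
    return np.where(z, y, x)

def mix3d(p, size=100, depth=3, period=12, rng=None):
    """MIX3D(p) sketch: the same construction extended to a size x size x depth volume."""
    rng = np.random.default_rng(rng)
    i, j, k = np.meshgrid(np.arange(size), np.arange(size), np.arange(depth), indexing="ij")
    x = (np.sin(2 * np.pi * i / period) + np.sin(2 * np.pi * j / period)
         + np.sin(2 * np.pi * k / period))
    y = rng.uniform(x.min(), x.max(), size=x.shape)
    z = rng.random(x.shape) < p
    return np.where(z, y, x)
```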
Figure 12. FuzEnC2D results for the 144 sub-images and the 300 × 300 pixel images of the CBT in the three color spaces (RGB, HSV, and YUV), with K1, K2, and K3 being the first, second, and third channel, respectively. The mean of the 144 sub-images is displayed as a “∘” sign and the value for the 300 × 300 pixel image is displayed as “*”.
Figure 13. FuzEnV2D results for the 144 sub-images and the 300 × 300 pixel images of the CBT in the three color spaces (RGB, HSV, and YUV). The mean of the 144 sub-images is displayed as a “∘” sign and the value for the 300 × 300 pixel image is displayed as “*”.
Figure 14. FuzEnV2D and Haralick feature p-values for 40 melanoma and 40 melanocytic nevi dermoscopic images in the three color spaces (RGB, HSV, and YUV). d denotes the inter-pixel distance for the co-occurrence matrices.
Figure 15. FuzEnM2D and Haralick feature p-values for 40 melanoma and 40 melanocytic nevi dermoscopic images in the three color spaces (RGB, HSV, and YUV). d denotes the inter-pixel distance for the co-occurrence matrices.
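Figures 14 and 15 pair the entropy measures with a Haralick texture feature computed from gray-level co-occurrence matrices at several inter-pixel distances d, reporting one p-value per channel for the melanoma versus nevi comparison. A minimal sketch using scikit-image and SciPy is given below; choosing contrast as the Haralick feature and the Mann-Whitney U test as the statistic are illustration assumptions, not necessarily the paper's choices.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from scipy.stats import mannwhitneyu

def haralick_feature(channel, distance=1, prop="contrast", levels=256):
    """One Haralick feature from the co-occurrence matrix of a single uint8 channel."""
    glcm = graycomatrix(
        channel, distances=[distance],
        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
        levels=levels, symmetric=True, normed=True,
    )
    return graycoprops(glcm, prop).mean()      # average over the four angles

def group_p_value(melanoma_values, nevi_values):
    """p-value for one feature between the melanoma and nevi groups."""
    return mannwhitneyu(melanoma_values, nevi_values, alternative="two-sided").pvalue
```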
Figure 16. ROC curves for FuzEnC2D results of the 40 melanoma and 40 melanocytic nevi images in the RGB color space. The curves are for FuzEnCR2D, FuzEnCG2D, and FuzEnCB2D (the R, G, and B channels), from left to right.
Figure 17. ROC curves for FuzEnV2D results of the 40 melanoma and 40 melanocytic nevi images in the RGB color space.
Figure 18. ROC curves for FuzEnM2D results of the 40 melanoma and 40 melanocytic nevi images in the RGB color space.
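The ROC analyses in Figures 16-18 can be reproduced from the per-image entropy values with standard tooling. A minimal sketch using scikit-learn; the labeling convention (1 for melanoma, 0 for nevi) and the function name are ours.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

def roc_for_feature(melanoma_values, nevi_values):
    """ROC curve and AUC for one entropy feature separating melanoma from nevi."""
    scores = np.concatenate([melanoma_values, nevi_values])
    labels = np.concatenate([np.ones(len(melanoma_values)),    # 1 = melanoma
                             np.zeros(len(nevi_values))])      # 0 = melanocytic nevus
    fpr, tpr, _ = roc_curve(labels, scores)
    # If the feature decreases for melanoma, negate the scores before calling roc_curve.
    return fpr, tpr, auc(fpr, tpr)
```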

References

1. Humeau-Heurtier A. Texture feature extraction methods: A survey. IEEE Access. 2019;7:8975–9000. doi: 10.1109/ACCESS.2018.2890743.
2. Song T., Feng J., Wang S., Xie Y. Spatially weighted order binary pattern for color texture classification. Expert Syst. Appl. 2020;147:113167. doi: 10.1016/j.eswa.2019.113167.
3. Liu L., Chen J., Fieguth P., Zhao G., Chellappa R., Pietikäinen M. From BoW to CNN: Two decades of texture representation for texture classification. Int. J. Comput. Vis. 2019;127:74–109. doi: 10.1007/s11263-018-1125-z.
4. Liu L., Fieguth P., Guo Y., Wang X., Pietikäinen M. Local binary features for texture classification: Taxonomy and experimental study. Pattern Recognit. 2017;62:135–160. doi: 10.1016/j.patcog.2016.08.032.
5. Nguyen T.P., Vu N.S., Manzanera A. Statistical binary patterns for rotational invariant texture classification. Neurocomputing. 2016;173:1565–1577. doi: 10.1016/j.neucom.2015.09.029.
