. 2009 Sep;10(9):648-58.
doi: 10.1631/jzus.B0930162.

A fast automatic recognition and location algorithm for fetal genital organs in ultrasound images


Sheng Tang et al. J Zhejiang Univ Sci B. 2009 Sep.

Abstract

Severe sex ratio imbalance at birth is now becoming an important issue in several Asian countries. Its leading immediate cause is prenatal sex-selective abortion following illegal sex identification by ultrasound scanning. In this paper, a fast automatic recognition and location algorithm for fetal genital organs is proposed as an effective method to help prevent ultrasound technicians from unethically and illegally identifying the sex of the fetus. This automatic recognition algorithm can be divided into two stages. In the 'rough' stage, a few pixels in the image, which are likely to represent the genital organs, are automatically chosen as points of interest (POIs) according to certain salient characteristics of fetal genital organs. In the 'fine' stage, a specifically supervised learning framework, which fuses an effective feature data preprocessing mechanism into the multiple classifier architecture, is applied to every POI. The basic classifiers in the framework are selected from three widely used classifiers: radial basis function network, backpropagation network, and support vector machine. The classification results of all the POIs are then synthesized to determine whether the fetal genital organ is present in the image, and to locate the genital organ within the positive image. Experiments were designed and carried out based on an image dataset comprising 658 positive images (images with fetal genital organs) and 500 negative images (images without fetal genital organs). The experimental results showed a true positive (TP) rate of 80.5% (265 of 329 samples) and a true negative (TN) rate of 83.0% (415 of 500 samples). The average computation time was 453 ms per image.
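The two-stage pipeline described above can be sketched as follows. This is a minimal illustration of the control flow only: the salience threshold, the stand-in salience measure, and the majority-vote fusion are assumptions for the sketch, not the paper's actual POI criteria, feature preprocessing, or classifier fusion rule, none of which are detailed in the abstract.

```python
import numpy as np

def rough_stage(image, threshold=0.8):
    """'Rough' stage: select points of interest (POIs), i.e. pixels likely
    to represent the target organ. A stand-in salience measure (normalized
    intensity) is used here; the paper selects POIs from salient
    characteristics of fetal genital organs."""
    salience = image / image.max()
    ys, xs = np.where(salience > threshold)
    return list(zip(ys, xs))

def fine_stage(feature_vectors, classifiers):
    """'Fine' stage: apply each base classifier (e.g. RBF network,
    backpropagation network, SVM) to every POI feature vector and fuse
    the per-POI decisions, here by simple majority vote."""
    votes = np.array([clf(feature_vectors) for clf in classifiers])  # (n_clf, n_poi)
    return votes.mean(axis=0) > 0.5  # majority of classifiers says "positive"

def recognize(image, extract_features, classifiers, min_positive_pois=1):
    """Synthesize POI-level decisions into an image-level decision and
    return the locations of positively classified POIs."""
    pois = rough_stage(image)
    if not pois:
        return False, []
    feats = extract_features(image, pois)
    decisions = fine_stage(feats, classifiers)
    positives = [p for p, d in zip(pois, decisions) if d]
    return len(positives) >= min_positive_pois, positives
```

In this sketch, `extract_features` and the three `classifiers` are placeholders; in the paper they would be the preprocessed feature data and the trained RBF/BP/SVM base classifiers.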


Figures

Fig.1
US images containing fetal genital organs (highlighted with a dashed square) (a) Male at 20 weeks; (b) Female at 23 weeks; (c) Female at 25 weeks; (d) Male at 25 weeks; (e) Male at 27 weeks; (f) Male at 27 weeks; (g) Female at 31 weeks; (h) Male at 34 weeks
Fig.2
The local speckle statistics computed on different windows
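Window-based speckle statistics of the kind shown in Fig.2 can be computed with a sliding window. The sketch below uses the local mean-to-standard-deviation ratio, a common speckle descriptor in ultrasound processing; the specific statistics and window sizes used in the paper are not given in the abstract, so this is an illustrative assumption.

```python
import numpy as np

def local_speckle_stats(image, win=7):
    """Slide a win x win window over the image and compute, for each
    pixel, the ratio of local mean to local standard deviation (an
    assumed speckle descriptor). Borders are handled by reflection
    padding; windows with zero deviation map to 0."""
    half = win // 2
    padded = np.pad(image.astype(float), half, mode="reflect")
    h, w = image.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            std = window.std()
            out[i, j] = window.mean() / std if std > 0 else 0.0
    return out
```

Computing the same statistic over several window sizes, as in Fig.2, would simply mean calling this with different `win` values and comparing the resulting maps.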
Fig.3
The segmentation results from the eight original US fetal images in Fig.1. Results from (a) Fig.1a; (b) Fig.1b; (c) Fig.1c; (d) Fig.1d; (e) Fig.1e; (f) Fig.1f; (g) Fig.1g; (h) Fig.1h
Fig.4
The selected POIs from Fig.1. Results from (a) Fig.1a; (b) Fig.1b; (c) Fig.1c; (d) Fig.1d; (e) Fig.1e; (f) Fig.1f; (g) Fig.1g; (h) Fig.1h
