Performance Comparison of Object Detection Networks for Shrapnel Identification in Ultrasound Images

Sofia I Hernandez-Torres et al. Bioengineering (Basel). 2023 Jul 5;10(7):807. doi: 10.3390/bioengineering10070807.

Abstract

Ultrasound imaging is a critical tool for triaging and diagnosing subjects but only if images can be properly interpreted. Unfortunately, in remote or military medicine situations, the expertise to interpret images can be lacking. Machine-learning image interpretation models that are explainable to the end user and deployable in real time with ultrasound equipment have the potential to solve this problem. We have previously shown how a YOLOv3 (You Only Look Once) object detection algorithm can be used for tracking shrapnel, artery, vein, and nerve fiber bundle features in a tissue phantom. However, real-time implementation of an object detection model requires optimizing model inference time. Here, we compare the performance of five object detection deep-learning models with varying architectures and trainable parameters to determine which is most suitable for this shrapnel-tracking ultrasound image application. We used a dataset of more than 16,000 ultrasound images from gelatin tissue phantoms containing artery, vein, nerve fiber, and shrapnel features for training and evaluating each model. Every object detection model surpassed 0.85 mean average precision except for the detection transformer model. Overall, the YOLOv7tiny model had the highest mean average precision and quickest inference time, making it the clear model choice for this ultrasound imaging application. The other object detection models overfit the data, as indicated by lower testing performance relative to training performance. In summary, the YOLOv7tiny object detection model was selected as optimal for this application. Future work will implement this object detection algorithm for real-time applications, an important step in translating AI models for emergency and military medicine.
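The comparison described above hinges on two numbers per model: mean average precision on held-out images and per-image inference time. As an illustrative sketch only (not the authors' code), the snippet below shows one way to measure mean inference time for a PyTorch detection model on dummy ultrasound-sized inputs; the torchvision SSDLite detector is an assumed stand-in for the five networks compared in the paper.

```python
import time

import torch
import torchvision

def mean_inference_time(model, n_images=20, size=(3, 512, 512), device="cpu"):
    """Average seconds per image for a detection model in eval mode."""
    model = model.to(device).eval()
    images = [torch.rand(*size, device=device) for _ in range(n_images)]
    with torch.no_grad():
        start = time.perf_counter()
        for img in images:
            model([img])  # torchvision detection models take a list of image tensors
        elapsed = time.perf_counter() - start
    return elapsed / n_images

if __name__ == "__main__":
    # Stand-in detector; each candidate network would be benchmarked the same way.
    detector = torchvision.models.detection.ssdlite320_mobilenet_v3_large()
    print(f"{mean_inference_time(detector):.4f} s/image")
```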

Keywords: artificial intelligence; deep learning; image interpretation; machine learning; neurovascular; object detection; shrapnel; ultrasound imaging.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Representative phantom images from each labeling category. Tissue phantom ultrasound images were divided into four categories: (A) Baseline simple phantom, no shrapnel or neurovascular features. (B) Shrapnel in the simple phantom, without neurovascular features. (C) Baseline complex phantom with vein, artery, and nerve objects. (D) Shrapnel in the complex phantom with vein, artery, and nerve objects. Representative US images were increased in brightness for ease of interpretation.
Figure 2
Visual description of intersection over union and average precision metrics. (A) Intersection over union (IoU) is calculated by dividing the intersection (pattern region in the numerator) of the ground-truth and predicted bounding boxes by the union of the two (area of both boxes combined, shown in denominator). (B) Representative ultrasound image showing a predicted (darker yellow) and ground-truth (lighter yellow) bounding box around a shrapnel object, with an IoU score of 0.66. The representative US image was increased in brightness for ease of interpretation. (C) A precision-recall curve from which we obtain the average precision (AP) by calculating area under the curve; in the example shown, the AP is 0.88. To generate this curve, precision is calculated as the total of true positives divided by all the positive predictions and recall as the total of true positives divided by the true positives and false negatives, for multiple classifier thresholds.
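The caption above defines IoU and AP in words; a minimal numeric sketch of both metrics follows, assuming boxes are stored as (x1, y1, x2, y2) pixel coordinates. This is an illustration of the standard formulas, not code from the paper, and AP is approximated by a step-wise sum under the precision-recall curve.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def average_precision(confidences, is_true_positive, n_ground_truth):
    """Step-wise area under the precision-recall curve swept over confidence thresholds."""
    order = np.argsort(confidences)[::-1]           # most confident predictions first
    hits = np.asarray(is_true_positive)[order]
    tp = np.cumsum(hits)
    fp = np.cumsum(~hits)
    precision = tp / (tp + fp)                      # TP / all positive predictions
    recall = tp / n_ground_truth                    # TP / (TP + FN)
    recall_steps = np.diff(np.concatenate(([0.0], recall)))
    return float(np.sum(recall_steps * precision))  # AP = sum of precision x recall step

# Predicted vs. ground-truth shrapnel box, analogous to Figure 2B
print(round(iou((120, 80, 200, 160), (130, 90, 210, 170)), 2))  # 0.62
```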
Figure 3
Representative test predictions using YOLOv7tiny. The three images are example object detection predictions for shrapnel (yellow), vein (blue), artery (red), and nerve fiber (green) taken from YOLOv7tiny. Images are shown for each of three categories: (A) shrapnel in the simple phantom, without neurovascular features; (B) baseline complex phantom, with vein, artery, and nerve; and (C) shrapnel in the complex phantom. The numbers adjacent to the predictions represent the confidence measure of that prediction.
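Each prediction in Figure 3 carries a confidence score, and low-confidence boxes are typically discarded before display. The small sketch below is an assumed illustration of that filtering step; the Detection class, its field names, and the 0.5 threshold are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # "shrapnel", "vein", "artery", or "nerve"
    confidence: float   # score shown next to each predicted box in Figure 3
    box: tuple          # (x1, y1, x2, y2) in pixel coordinates

def filter_detections(detections, threshold=0.5):
    """Keep only predictions confident enough to overlay on the ultrasound image."""
    return [d for d in detections if d.confidence >= threshold]

preds = [Detection("shrapnel", 0.91, (140, 60, 180, 110)),
         Detection("artery", 0.34, (60, 200, 120, 260))]
print(filter_detections(preds))  # only the 0.91 shrapnel detection survives
```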
