Lizard Body Temperature Acquisition and Lizard Recognition Using Artificial Intelligence

Ana L Afonso et al. Sensors (Basel). 2024 Jun 26;24(13):4135. doi: 10.3390/s24134135.

Abstract

The acquisition of the body temperature of animals kept in captivity in biology laboratories is crucial for several studies in animal biology. Traditionally, the acquisition was carried out manually, which guaranteed neither accuracy nor consistency in the acquired data and was painful for the animal. The process was later switched to a semi-manual one using a thermal camera, but it still required manually clicking on each part of the animal's body every 20 s of video to obtain temperature values, making it time-consuming, non-automatic, and difficult. This project automates the acquisition process through the automatic recognition of the parts of a lizard's body, reading the temperature of those parts from a video taken with two cameras simultaneously: an RGB camera and a thermal camera. The RGB camera is used to detect the locations of the lizard's body parts with artificial intelligence techniques, and the thermal camera provides the corresponding temperature of each part. Because no lizard dataset was available, either in the biology laboratory or online, a dataset was created from scratch, annotated with the lizard and six of its body parts. YOLOv5 was used to detect the lizard and its body parts in the RGB images, achieving a precision of 90.00% and a recall of 98.80%. After an initial calibration, the RGB and thermal camera images are spatially aligned, so the lizard's position can be determined even when the lizard is at the same temperature as its surrounding environment, through a coordinate conversion from the RGB image to the thermal image. The thermal image includes a colour temperature scale with the corresponding maximum and minimum temperature values, which is used to interpret each pixel of the thermal image and thus read the correct temperature at each part of the lizard.
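The per-pixel temperature reading described above can be sketched as a linear mapping from pixel intensity to the scale's temperature range. This is a minimal sketch under our own assumptions (the function name is ours, and we assume an 8-bit grayscale rendering of the colour scale that varies linearly between the printed minimum and maximum temperatures); the paper does not specify the exact mapping.

```python
def pixel_to_temperature(gray_value, t_min, t_max):
    """Map an 8-bit pixel intensity from the thermal image to a temperature.

    Assumes the thermal camera's colour scale is linear between the
    minimum (t_min) and maximum (t_max) temperatures shown next to the
    scale bar, with intensity 0 at t_min and 255 at t_max.
    """
    return t_min + (gray_value / 255.0) * (t_max - t_min)

# Example: a mid-intensity pixel on a 20-35 degC scale
temp = pixel_to_temperature(128, 20.0, 35.0)
```

With this mapping, iterating over the pixels chosen for each detected body part yields one temperature per part per frame.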

Keywords: YOLO; artificial intelligence; body temperature acquisition; computer vision; lizards; object detection.

Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1. Architecture diagram of YOLOv5s.
Figure 2. Scenario: (a) setup and (b) cardboard box used.
Figure 3. Example of a labelled dataset image in Roboflow.
Figure 4. The scenario used to obtain thermal images and their associated RGB images.
Figure 5. RGB camera output (left side) and thermal camera output (right side).
Figure 6. Detection of the lizard and its six body parts on the ROI (left).
Figure 7. All pixels within the (a) “Lizard” and (b) “Tail” bounding boxes are white.
Figure 8. Isolation of the ROI defined by the (a) “Lizard” and (b) “Tail” bounding boxes.
Figure 9. The (a) “Lizard” and (b) “Tail” bounding boxes in grayscale.
Figure 10. The (a) “Lizard” and (b) “Tail” bounding boxes in black and white (binary).
Figure 11. Bounding boxes and their respective central pixels, represented by a blue circle (in the ROI).
Figure 12. The initial pixel (centre) representing the “Lizard” (blue rectangle) and “Tail” (yellow rectangle) bounding boxes, marked with a blue dot. The final pixel representative of each bounding box is marked with a green dot.
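The pipeline suggested by Figures 7 to 12 (binarise each bounding box, start from its central pixel, and move to a pixel that actually lies on the lizard) can be sketched as follows. This is our own reading of the figures, not the authors' code: the fallback of snapping to the nearest foreground pixel is an assumption about how the "final" representative pixel is chosen.

```python
import numpy as np

def representative_pixel(mask):
    """Pick a pixel that lies on the animal inside a bounding box.

    mask: 2D boolean array (True = lizard pixel) obtained by binarising
    the bounding-box region. Starts from the box centre; if the centre
    falls on the background, falls back to the nearest foreground pixel.
    Returns (row, col).
    """
    h, w = mask.shape
    cy, cx = h // 2, w // 2
    if mask[cy, cx]:
        return cy, cx
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return cy, cx  # no foreground found; keep the centre
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2
    i = int(np.argmin(d2))
    return int(ys[i]), int(xs[i])
```

Using a foreground pixel rather than the raw box centre matters for thin, curved parts such as the tail, where the centre of the bounding box often lies on the background.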
Figure 13. The “src” parameter is represented by the coordinates of points 1, 2, 3, and 4, and the “dst” parameter by the coordinates of points 1′, 2′, 3′, and 4′.
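The RGB-to-thermal coordinate conversion implied by the four src/dst point pairs in Figure 13 can be sketched as a planar homography estimated from those calibration points. The sketch below solves for the homography directly in NumPy; OpenCV's cv2.getPerspectiveTransform performs an equivalent computation, and the function names here are ours.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping four src points to four dst points.

    src, dst: lists of four (x, y) tuples, e.g. the calibration points
    clicked in the RGB image and their counterparts in the thermal image.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, x, y):
    """Apply the homography to one (x, y) coordinate, in homogeneous form."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

Once H is estimated from the calibration points, every representative pixel found in the RGB image can be mapped into thermal-image coordinates with map_point.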
Figure 14. Bounding boxes and their representative pixels marked with blue dots (RGB image) and the corresponding pixels marked with red dots in the thermal image.
Figure 15. Annotation of the maximum Y (Ymax), minimum Y (Ymin), and median X (Xmed) relative to the coordinate axis of the input image.
Figure 16. Resulting graphs after training with a batch size of 32 and 500 epochs.
Figure 17. F1–confidence curve.
Figure 18. Precision–recall curve.
Figure 19. Example of an image from the test set with predictions.
Figure 20. Example of an image from the test set with noise and predictions.
Figure 21. Notepad file with date, hour, class, and temperature values obtained: (a) from an image and (b) from a video.
