Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm
- PMID: 37889777
- PMCID: PMC10251900
- DOI: 10.3390/ani13111861
Abstract
In large-scale laying hen farming, timely detection of dead chickens helps prevent cross-infection, disease transmission, and economic loss. Dead chicken detection is still performed manually and is one of the major labor costs on commercial farms. This study proposed a new method for dead chicken detection using multi-source images and deep learning and evaluated the detection performance with different source images. We first introduced a pixel-level image registration method that used depth information to project the near-infrared (NIR) and depth images into the coordinate system of the thermal infrared (TIR) image, yielding registered images. Then, the registered single-source (TIR, NIR, depth), dual-source (TIR-NIR, TIR-depth, NIR-depth), and multi-source (TIR-NIR-depth) images were separately used to train dead chicken detection models with object detection networks, including YOLOv8n, Deformable DETR, Cascade R-CNN, and TOOD. The results showed that, at an IoU (Intersection over Union) threshold of 0.5, the performance of these models was not entirely the same. Among them, the model using the NIR-depth image with Deformable DETR achieved the best performance, with an average precision (AP) of 99.7% (IoU = 0.5) and a recall of 99.0% (IoU = 0.5). As the IoU threshold increased, we found the following: the model with the NIR image achieved the best performance among models with single-source images, with an AP of 74.4% (IoU = 0.5:0.95) in Deformable DETR; the performance with dual-source images was higher than that with single-source images; the model with the TIR-NIR or NIR-depth image outperformed the model with the TIR-depth image, achieving APs of 76.3% (IoU = 0.5:0.95) and 75.9% (IoU = 0.5:0.95) in Deformable DETR, respectively; and the model with the multi-source image also achieved higher performance than that with single-source images.
However, there was no significant improvement compared to the model with the TIR-NIR or NIR-depth image; the AP of the model with the multi-source image was 76.7% (IoU = 0.5:0.95) in Deformable DETR. By analyzing the detection performance with different source images, this study provides a reference for selecting and using multi-source images for detecting dead laying hens on commercial farms.
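The paper's registration step projects NIR and depth pixels into the TIR image's coordinate system using depth information. The exact formulation is not given in this record; a minimal NumPy sketch of depth-based reprojection, assuming known camera intrinsics (K_nir, K_tir) and extrinsics (R, t) between the two sensors (all names hypothetical), could look like:

```python
import numpy as np

def register_to_tir(depth_m, K_nir, K_tir, R, t):
    """Map each NIR pixel to TIR image coordinates via depth reprojection.

    depth_m : (h, w) per-pixel depth in meters, aligned with the NIR image
    K_nir, K_tir : 3x3 intrinsic matrices (hypothetical calibration values)
    R, t : rotation (3x3) and translation (3,) from NIR frame to TIR frame
    Returns an (h, w, 2) array of (u, v) coordinates in the TIR image.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Homogeneous pixel coordinates, shape (h, w, 3)
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(float)
    # Back-project NIR pixels to 3-D points in the NIR camera frame
    pts = depth_m[..., None] * (pix @ np.linalg.inv(K_nir).T)
    # Rigid transform into the TIR camera frame, then project with K_tir
    pts_tir = pts @ R.T + t
    proj = pts_tir @ K_tir.T
    # Perspective divide yields per-pixel (u, v) in the TIR image
    return proj[..., :2] / proj[..., 2:3]
```

The returned coordinate map can then be used to resample the NIR and depth images into the TIR frame (e.g., with bilinear interpolation), producing the pixel-aligned multi-source images the models are trained on.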
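The AP and recall figures above depend on the IoU threshold used to match predicted boxes to ground truth (0.5 for the single-threshold results, averaged over 0.5:0.95 for the stricter ones). For reference, IoU between two axis-aligned boxes is the intersection area divided by the union area; a small illustrative implementation:

```python
def box_iou(a, b):
    """IoU between two boxes given as (x1, y1, x2, y2) corner tuples."""
    # Intersection rectangle (empty if the boxes do not overlap)
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    # Union = sum of areas minus the shared intersection
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```

A detection counts as a true positive only when its IoU with a ground-truth box meets the threshold, which is why AP drops as the threshold rises from 0.5 toward 0.95.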
Keywords: dead laying hen detection; deep learning; depth image; image registration; large-scale farming; near-infrared image; thermal infrared image.
Conflict of interest statement
The authors declare no conflict of interest.