. 2024 Oct 24;14(21):3059.
doi: 10.3390/ani14213059.

The Posture Detection Method of Caged Chickens Based on Computer Vision


Cheng Fang et al. Animals (Basel). 2024.

Abstract

At present, raising chickens in cages is a common farming method in China. However, monitoring the status of caged chickens is still done by human labor, which is time-consuming and laborious. This paper proposes a posture detection method for caged chickens based on computer vision that can automatically identify the standing and lying postures of chickens in a cage. To this end, an image correction method was used to rotate the image so that the feeding trough appears horizontal. The variance method and the speeded-up robust features (SURF) method were proposed to identify the feeding trough and, from its position, indirectly locate the key area. A depth camera was used to provide three-dimensional information so that the chickens could be extracted from the image of the key area. After applying several constraint conditions, the chickens' postures were screened. The experimental results show that the algorithm achieves 97.80% precision and 80.18% recall (IoU > 0.5) for white chickens, and 79.52% precision and 81.07% recall (IoU > 0.5) for jute chickens (yellow and black feathers). It runs at ten frames per second on an i5-8500 CPU. Overall, the results indicate that this study provides a non-invasive method for analyzing the posture of caged chickens, which may be helpful for future research on poultry.
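The precision and recall above are evaluated at an IoU threshold of 0.5. As a minimal illustration (not the authors' code), the intersection-over-union of two axis-aligned bounding boxes can be computed like this:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# a detection counts as a true positive when IoU with a ground-truth box exceeds 0.5
overlap = iou((0, 0, 10, 10), (5, 0, 15, 10))  # intersection 50, union 150 -> 1/3
```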

Keywords: caged chicken; computer vision; depth image; posture detection; smart agriculture.


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1
(a) Female and (b) male K90 jute broilers.
Figure 2
The production base in Gaoming District, Foshan City, Guangdong Province.
Figure 3
Image acquisition. The color image is divided into three single-channel images of the R component, the G component, and the B component, and the three components are merged with the deep image into a four-channel image for storage.
Figure 4
Arrow A stands for the feeding trough recognition and distance calculation between the camera and the feeding trough; Arrow B represents the key area location and the prostrate broiler recognition within this area.
Figure 5
Algorithm flow chart.
Figure 6
Rotation of the original image for correction. (a,c) The color and depth images before correction; (b,d) the color and depth images after correction; (e) the effect of under edge detection; (f) the effect of morphological processing; (g) the effect of the first Hough transform; and (h) the final effect.
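The correction step detects the feeding trough's lower edge (via edge detection and a Hough transform) and rotates the image until that edge is horizontal. As a hedged sketch of only the angle computation (the endpoints would come from the detected Hough line; the names here are illustrative, not the paper's):

```python
import math

def correction_angle(x1, y1, x2, y2):
    """Angle (in degrees) by which to rotate the image so that the detected
    trough edge from (x1, y1) to (x2, y2) becomes horizontal."""
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

# an edge rising 10 pixels over 100 pixels across needs roughly 5.7 degrees of rotation
angle = correction_angle(0, 0, 100, 10)
```

In practice the rotation itself would be applied identically to the color and depth images so that they stay registered.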
Figure 7
Trough location. (a) The variance method to find the trough area; (b,c) The SURF method to find the trough area.
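The captions do not reproduce the variance method's details, but the idea of flagging a visually uniform strip (such as a trough) by its low intensity variance can be sketched as follows; the window width and the toy intensities are illustrative assumptions:

```python
def window_variances(values, width):
    """Variance of each sliding window of `width` samples (e.g. per-row mean
    intensities).  A uniform structure such as a feeding trough shows up as a
    run of low-variance windows."""
    out = []
    for i in range(len(values) - width + 1):
        w = values[i:i + width]
        mean = sum(w) / width
        out.append(sum((v - mean) ** 2 for v in w) / width)
    return out

row_means = [10, 10, 11, 10, 200, 10, 10, 10]  # toy per-row intensities
variances = window_variances(row_means, 3)
trough_start = variances.index(min(variances))  # most uniform window
```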
Figure 8
Key area location. (a) The area where the trough is located is shown with a red vertical line segment in the depth image; (b) the area where the trough is located is shown with a red vertical line segment in the color image, and the key area is marked with a red rectangle; (c) the image of the key area after extraction.
Figure 9
Posture identification algorithm steps. (a) The process of extracting the object from the depth image using distance information, wherein rectangle ABGJ was the identified trough and rectangle GJKN was the key area. For example, after the real distance from point D to the camera was obtained in (a), the real distances from all points on HL to the camera were obtained and compared with the real distance of point D. If a point satisfied Equation (3), it was considered a foreground point; otherwise, it was a background point. Similarly, after obtaining the real distance from point E to the camera, all the points on line segment IM were judged in order to segment the image. (b) The result of segmenting the image; (c) the result of the morphology process; (d) the result after deleting columns with fewer white pixels; (e) the result of combining multiple contours with close distances; and (f) the final result of identifying the broiler in a lying posture.
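Equation (3) itself is not reproduced in these captions, so the following is only a hedged stand-in for the depth-based foreground test it describes: a pixel is treated as foreground when it is closer to the camera than the trough reference distance by at least some margin (the `margin_mm` parameter is a hypothetical placeholder for the paper's actual threshold):

```python
import numpy as np

def segment_foreground(depth_mm, ref_mm, margin_mm=100):
    """Binary foreground mask: True where a pixel is at least `margin_mm`
    closer to the camera than the reference distance to the trough.
    `margin_mm` is a hypothetical stand-in for the threshold in Equation (3)."""
    return depth_mm < (ref_mm - margin_mm)

depth = np.array([[1500, 1500, 1200],
                  [1500, 1180, 1190]])  # toy depth map in millimeters
mask = segment_foreground(depth, ref_mm=1500)  # chicken pixels sit in front of the trough
```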
Figure 10
Combining objects. (a) The result after performing step d6; (b) the result of directly performing step d8, skipping step d7 based on (a); (c) the result of performing step d9 based on (b); (d) the result after performing step d7 based on (a); (e) the result after performing step d8 based on (d); and (f) the result after performing step d9 based on (e).
Figure 11
Finding a satisfactory rectangular object. (a) The result after performing step d5; (b) the design of the pillar; (c) the result after performing step d8; and (d) the result after performing step d9.
Figure 12
Comparison of the key area location algorithm. F1 means F1 score. (a) An algorithm test for the white broiler house and (b) an algorithm test for the jute broiler house.
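The F1 score used in this comparison is the harmonic mean of precision and recall; for instance, plugging in the abstract's white-broiler figures gives a value just over 0.88:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# white-broiler figures from the abstract at IoU > 0.5
score = f1(0.9780, 0.8018)  # approximately 0.881
```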
Figure 13
The influence of the parameters k1, k2, k3, k4, and k5 on the detection effect of the algorithm. Blue indicates the detection effect of the white broiler house, and red indicates that of the jute broiler house. (a–e) The correlation between one of the parameters and the F1 score while keeping the other four parameters optimal (k1 = 0.6, k2 = 65, k3 = 250, k4 = 2.0, k5 = 1.7); (f) the F1 score at different IoUs when all five parameters were optimal.
Figure 14
Algorithm stability test at large tilt angles. (a) The original color image; (b) the image after correction; and (c) the effect of image correction on the lying broiler identification.
Figure 15
Algorithm stability test under different ambient light. (a,b) Cages that were not exposed to sunlight; (c,d) cages that were exposed to direct sunlight near the window; (a,c) the effect on the key area location; and (b,d) the effect on the lying broiler identification.
