Foods. 2022 Dec 23;12(1):84. doi: 10.3390/foods12010084.

Double-Camera Fusion System for Animal-Position Awareness in Farming Pens

Shoujun Huo et al.
Abstract

In livestock breeding, continuous and objective monitoring of animals is infeasible to carry out manually because of the large scale of breeding operations and the cost of labour. Computer vision technology can generate accurate, real-time information on individual animals or animal groups from video surveillance. However, frequent occlusion between animals and changes in appearance features caused by varying lighting conditions make single-camera systems less attractive. To address these issues, we propose a double-camera system with image registration algorithms that spatially fuse the information from different viewpoints. This paper presents a deformable learning-based registration framework in which the input image pairs are first linearly pre-registered. An unsupervised convolutional neural network is then employed to fit the mapping from one view to the other, using a large number of unlabelled samples for training. The learned parameters are transferred to a semi-supervised network and fine-tuned with a small number of manually annotated landmarks. The actual pixel displacement error is introduced as a complement to an image similarity measure. The fine-tuned method is evaluated on real farming datasets and yields significantly lower registration errors than commonly used feature-based and intensity-based methods, while registering an unseen image pair in less than 0.5 s. The proposed method provides a high-quality reference processing step that improves subsequent tasks such as multi-object tracking and behaviour recognition of animals for further analysis.
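The two-stage objective described above can be summarised with a short sketch: during unsupervised pretraining only an image similarity term drives the network, and during fine-tuning the actual pixel displacement error of the annotated landmarks is added as a complement. The snippet below is a minimal illustration in PyTorch, not the authors' code; the function names, the MSE similarity term, the (dx, dy) displacement-field layout, and the weighting factor lam are assumptions.

    # Minimal sketch of a semi-supervised registration loss (illustrative only).
    import torch
    import torch.nn.functional as F

    def similarity_loss(warped, fixed):
        # Intensity mean-squared error between the warped moving image and the fixed image.
        return F.mse_loss(warped, fixed)

    def landmark_error(flow, moving_pts, fixed_pts):
        # flow:       (N, 2, H, W) dense displacement field in pixels, channels = (dx, dy).
        # moving_pts: (N, K, 2) landmark coordinates (x, y) in the moving image.
        # fixed_pts:  (N, K, 2) corresponding landmark coordinates in the fixed image.
        x = moving_pts[..., 0].long().clamp(0, flow.shape[3] - 1)
        y = moving_pts[..., 1].long().clamp(0, flow.shape[2] - 1)
        batch = torch.arange(flow.shape[0]).unsqueeze(1)      # (N, 1) batch indices
        disp = flow[batch, :, y, x]                           # (N, K, 2) displacement sampled at each landmark
        warped_pts = moving_pts + disp                        # landmark positions after warping
        return (warped_pts - fixed_pts).norm(dim=-1).mean()   # mean Euclidean pixel error

    def semi_supervised_loss(warped, fixed, flow, moving_pts, fixed_pts, lam=0.1):
        # Fine-tuning objective: image similarity complemented by the landmark displacement error.
        # Unsupervised pretraining corresponds to lam = 0 (similarity term only).
        return similarity_loss(warped, fixed) + lam * landmark_error(flow, moving_pts, fixed_pts)

A smoothness penalty on the displacement field is commonly added in deformable registration frameworks; it is omitted here for brevity.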

Keywords: convolutional neural network; double-camera; fine-tune; image registration.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Pig pen layout and lens positions.
Figure 2. Image pairs at various stages of pre-processing.
Figure 3. The proposed method consists of two stages: first, the unsupervised network is trained on image pairs without landmarks to obtain weights that apply across pig pens and over time; second, these weights are imported into the semi-supervised network and fine-tuned on samples labelled with landmarks.
Figure 4. The architecture of the proposed network.
Figure 5. Examples of the two types of landmarks and their correspondences. Red dots indicate background objects and yellow dots indicate the foreground.
Figure 6. Boxplots of TRE (target registration error) scores of each image registration method for aligning the background and the foreground objects. For each method, a cluster of three results is grouped: seen pens, unseen pen, and all pens. (A minimal sketch of the TRE computation follows this figure list.)
Figure 7. Illustrations of non-linear dense registration effects and errors of the unsupervised and fine-tuned networks in surveillance of a pen with pigs.
Figure 8. The proposed method retains the appearance features specific to the moving images while aligning them to the fixed images.
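Figure 6 reports registration accuracy as TRE (target registration error). As a point of reference, the sketch below shows the usual form of this metric: the Euclidean pixel distance between each warped landmark and its corresponding landmark in the fixed image. The function name and the use of the mean as a summary statistic are illustrative assumptions rather than the paper's exact evaluation protocol.

    # Minimal sketch of target registration error (TRE) in pixels (illustrative only).
    import numpy as np

    def target_registration_error(warped_landmarks, fixed_landmarks):
        # Both inputs: (K, 2) arrays of (x, y) pixel coordinates for K corresponding landmarks.
        diffs = np.asarray(warped_landmarks) - np.asarray(fixed_landmarks)
        return np.linalg.norm(diffs, axis=1)   # per-landmark Euclidean distance

    # Example with three landmark pairs and small residual misalignments:
    tre = target_registration_error([[10.0, 12.0], [40.5, 33.0], [70.0, 80.0]],
                                    [[10.0, 14.0], [41.5, 33.0], [70.0, 79.0]])
    print(tre.mean())  # (2.0 + 1.0 + 1.0) / 3 ≈ 1.33 pixels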
