Markerless Mouse Tracking for Social Experiments
- PMID: 38233144
- PMCID: PMC10901195
- DOI: 10.1523/ENEURO.0154-22.2023
Abstract
Automated behavior quantification in socially interacting animals requires accurate tracking. While many existing methods are successful and generalize well across settings, mistaken identities and loss of key anatomical features remain common problems, and alleviating them typically demands additional human effort in training or post-processing. We propose a markerless, video-based tool that simultaneously tracks two visually identical interacting mice in controlled settings, improving tracking accuracy without added human effort, for quantifying behaviors such as different types of sniffing, touching, and locomotion. The tool combines conventional handcrafted tracking with deep-learning-based techniques. It is trained on a small number of manually annotated images from a basic experimental setup and outputs a body mask and the coordinates of the snout and tail-base for each mouse. We tested the method under several commonly used experimental conditions, including bedding in the cage and fiber-optic or headstage implants on the mice. Without any human corrections after the automated analysis, identity switches were nearly eliminated and tracking accuracy improved by ∼15% over purely deep-learning-based pose-estimation approaches. Our approach can optionally be ensembled with such techniques for further improvement. Finally, we demonstrate an application of this approach to studies of social behavior in mice by quantifying and comparing interactions between pairs of mice in which some lack olfaction. Together, these results suggest that our approach could be valuable for studying group behaviors in rodents, such as social interactions.
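The abstract's core tracking problem is maintaining each mouse's identity across frames from per-frame body masks. A minimal, illustrative sketch of one common strategy (not the authors' implementation) is to match current-frame masks to previous-frame identities by greatest mask overlap (IoU); the function names and greedy matching scheme here are assumptions for illustration only:

```python
import numpy as np

def mask_iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection-over-union of two boolean body masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return float(inter) / float(union) if union else 0.0

def assign_identities(prev_masks, curr_masks):
    """Greedily match current-frame masks to previous-frame identities
    by highest mask overlap, so identity labels persist across frames."""
    # Score every (previous, current) mask pair, best overlaps first.
    pairs = sorted(
        ((mask_iou(p, c), i, j)
         for i, p in enumerate(prev_masks)
         for j, c in enumerate(curr_masks)),
        key=lambda t: t[0],
        reverse=True,
    )
    assignment, used_prev, used_curr = {}, set(), set()
    for _, i, j in pairs:
        if i not in used_prev and j not in used_curr:
            assignment[j] = i  # current mask j keeps previous identity i
            used_prev.add(i)
            used_curr.add(j)
    return assignment
```

With only two animals, greedy matching suffices; a production tracker would typically use optimal assignment (e.g., the Hungarian algorithm) and additional cues such as keypoint positions.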
Keywords: computer vision; deep learning; mouse tracking; social behavior.
Copyright © 2024 Le et al.
Conflict of interest statement
The authors declare no competing financial interests.