Neurosci Lett. 2023 Sep 25;814:137461. doi: 10.1016/j.neulet.2023.137461. Epub 2023 Aug 23.

Chronotate: An open-source tool for manual timestamping and quantification of animal behavior


Paul A Philipsberg et al.

Abstract

A core necessity for behavioral neuroscience research is the ability to accurately measure performance on behavioral assays, such as the novel object location and novel object recognition tasks. These tasks are widely used in neuroscience research and measure a rodent's instinct for investigating novel features as a proxy to test its memory of a previous experience. Automated tools for scoring behavioral videos can be cost-prohibitive and often have difficulty distinguishing between active investigation of an object and mere proximity to an object. As such, many experimenters continue to rely on hand-scoring interactions with stopwatches, which makes it difficult to review scoring after the fact and results in the loss of temporal information. Here, we introduce Chronotate, a free, open-source tool to aid in manually scoring novel object behavior videos. The software consists of an interactive video player with keyboard integration for marking timestamps of behavioral events during video playback, making it simple to quickly score and review bouts of rodent-object interaction. In addition, Chronotate outputs detailed interaction bout data, allowing for nuanced behavioral performance analyses. Using this detailed temporal information, we demonstrate that novel object location performance peaks within the first 3 s of interaction time and that preference for the novel location decreases across the test session. Thus, Chronotate can be used to determine the temporal structure of interactions on this task and can provide new insight into the memory processes that drive this behavior. Chronotate is available for download at: https://github.com/ShumanLab/Chronotate.
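
Because Chronotate exports bout-level timestamps rather than a single stopwatch total, scoring can be re-analyzed by script. As a minimal sketch (the abstract does not specify the export schema, so the CSV layout and the column names "object", "start_s", and "end_s" below are assumptions for illustration), a preference score could be computed from per-bout durations:

    import pandas as pd

    # Minimal sketch: compute a novelty-preference score from per-bout
    # timestamps. The CSV schema ("object", "start_s", "end_s") is an
    # assumption for illustration; consult Chronotate's actual export.
    def preference_score(csv_path):
        bouts = pd.read_csv(csv_path)
        bouts["duration_s"] = bouts["end_s"] - bouts["start_s"]
        totals = bouts.groupby("object")["duration_s"].sum()
        novel = totals.get("novel", 0.0)
        familiar = totals.get("familiar", 0.0)
        # Preference = fraction of total exploration spent on the novel
        # object-location; 0.5 indicates no preference.
        return novel / (novel + familiar)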

Keywords: Novel object location; Novel object recognition; Open-source; Rodent behavior.

Conflict of interest statement

Declaration of Competing Interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Figures

Figure 1: Chronotate for novel object location test scoring.
(A) Schematic of the novel object location (NOL) protocol. Mice were handled for 5 days before being habituated to an empty arena for 2 days. On the 8th day, mice were re-habituated to the empty arena before a training session in which 2 objects were added to the arena. Following a 1 hr, 4 hr, or 1 day delay, mice were placed back into the arena with one of the objects relocated. (B) Chronotate graphical interface. A list of time-stamped interaction bouts is displayed on the right, and a progress bar along the bottom of the interface allows seeking to precise time points. (C) Default keyboard controls for scoring object interaction (J, K, L, ;), speeding up or slowing down (W, S), skipping forward or back (D, A), and playing or pausing (space). Key bindings can be customized as desired via the ‘customize key binds’ dialog at the top of the main window.
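
The caption does not detail how key presses become bout records; one plausible toggle-based sketch of the underlying timestamp logging (the key bindings, names, and signature here are hypothetical, not Chronotate's actual code) is:

    # Hypothetical toggle-based bout logger illustrating keyboard
    # timestamping: the first press of a scoring key opens a bout at the
    # current video time, the next press of the same key closes it.
    KEY_TO_OBJECT = {"j": "object_1", "k": "object_2"}  # assumed bindings

    open_bouts = {}  # object -> bout start time (s)
    bouts = []       # completed (object, start_s, end_s) records

    def on_scoring_key(key, video_time_s):
        obj = KEY_TO_OBJECT.get(key)
        if obj is None:
            return
        if obj in open_bouts:
            bouts.append((obj, open_bouts.pop(obj), video_time_s))
        else:
            open_bouts[obj] = video_time_s
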
Figure 2: Time-stamped outputs allow detailed assessment of inter-rater reliability.
(A) Preference scores from Chronotate are highly correlated with those scored with a stopwatch (Pearson's r = 0.99, p < 0.0001, n = 16 videos). (B) Preference scores from two independent raters are highly correlated (Pearson's r = 0.93, p < 0.0001, n = 42 animals). (C, D) Cumulative exploration across a video with high inter-rater agreement (C) and a video with lower inter-rater agreement (D). The points corresponding to these two videos are marked in B.
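
The inter-rater comparisons in (A) and (B) are standard Pearson correlations; a sketch using SciPy with placeholder scores (illustrative values, not the paper's data):

    from scipy.stats import pearsonr

    # Placeholder preference scores for two raters (illustrative only).
    rater_1 = [0.62, 0.55, 0.71, 0.48, 0.66]
    rater_2 = [0.60, 0.58, 0.69, 0.50, 0.64]

    r, p = pearsonr(rater_1, rater_2)
    print(f"Pearson's r = {r:.2f}, p = {p:.4g}")
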
Figure 3: Determining the time course of object preference across the testing session.
(A) Animals successfully learned to discriminate between the novel and familiar object-location pair during the test session (paired t-test, p = 0.028, n = 8 animals), showing a preference for the novel object-location. (B) Mean cumulative preference for the novel object-location over the course of the session (blue, left axis) and mean cumulative total object exploration time (pink, right axis). Preference for investigating the novel object-location peaks at approximately 45 s after the start of the session, corresponding to an average combined exploration of both objects of 2.7 s. Shaded error bars represent SEM.
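
A cumulative-preference curve like the one in (B) can be derived directly from bout timestamps. Below is a sketch under assumed inputs (bouts as (label, start_s, end_s) tuples labeled "novel" or "familiar"; the session length and bin width are illustrative, not the paper's parameters):

    import numpy as np

    def cumulative_preference(bouts, session_len_s=600.0, dt=1.0):
        """Cumulative novel-object preference over time from bout tuples.

        bouts: iterable of (label, start_s, end_s), labels assumed to be
        "novel" or "familiar". Returns bin times, cumulative preference,
        and cumulative total exploration time.
        """
        t = np.arange(0.0, session_len_s, dt)
        novel = np.zeros_like(t)
        total = np.zeros_like(t)
        for label, start, end in bouts:
            # Overlap of each time bin [t, t+dt) with this bout.
            overlap = np.clip(np.minimum(t + dt, end) - np.maximum(t, start), 0.0, dt)
            total += overlap
            if label == "novel":
                novel += overlap
        cum_total = np.cumsum(total)
        with np.errstate(invalid="ignore", divide="ignore"):
            pref = np.cumsum(novel) / cum_total  # NaN before the first bout
        return t, pref, cum_total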

