Behav Res Methods. 2024 Feb;56(2):986-1001. doi: 10.3758/s13428-023-02082-9. Epub 2023 Mar 15.

GesturalOrigins: A bottom-up framework for establishing systematic gesture data across ape species

Charlotte Grund et al. Behav Res Methods. 2024 Feb.

Abstract

Current methodologies present significant hurdles to understanding patterns in the gestural communication of individuals, populations, and species. To address this issue, we present a bottom-up data collection framework for the study of gesture: GesturalOrigins. By "bottom-up", we mean that we minimise a priori structural choices, allowing researchers to define larger concepts (such as 'gesture types', 'response latencies', or 'gesture sequences') flexibly once coding is complete. Data can easily be re-organised to provide replication of, and comparison with, a wide range of datasets in published and planned analyses. We present packages, templates, and instructions for the complete data collection and coding process. We illustrate the flexibility that our methodological tool offers with worked examples of (great ape) gestural communication, demonstrating differences in the duration of action phases across distinct gesture action types and showing how species variation in the latency to respond to gestural requests may be revealed or masked by methodological choices. While GesturalOrigins is built from an ape-centred perspective, the basic framework can be adapted across a range of species and potentially to other communication systems. By making our gesture coding methods transparent and open access, we hope to enable a more direct comparison of findings across research groups, improve collaborations, and advance the field to tackle some of the long-standing questions in comparative gesture research.
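To make the "bottom-up" idea concrete, here is a minimal sketch (not part of the published GesturalOrigins packages; all column names are hypothetical) of how coded records could be re-aggregated into different "gesture type" definitions after coding is complete, depending on which modifiers an analysis treats as type-defining.

    import pandas as pd

    # Hypothetical coded records; the columns mirror tiers described in Fig. 1.
    records = pd.DataFrame({
        "gesture_action": ["Reach", "Reach", "Beckon"],
        "body_part_signaller": ["Arm", "Arm", "Arm"],
        "signaller_laterality": ["Right", "Left", "Right"],
    })

    # Analysis A: the gesture action alone defines the gesture type.
    records["type_A"] = records["gesture_action"]

    # Analysis B: gesture action plus body part and laterality define the type,
    # splitting 'Reach' into right- and left-handed variants for comparison
    # with datasets that distinguish types more finely.
    records["type_B"] = (records["gesture_action"] + "_"
                         + records["body_part_signaller"] + "_"
                         + records["signaller_laterality"])

    print(records[["type_A", "type_B"]])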

Keywords: GesturalOrigins; Gesture action phases; Language evolution; Video coding; Visual communication.


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Fig. 1
Gesture record: coding the gesture action and its modifiers – example: ‘reach’. Note. Bottom-up gesture type construction: coding the gesture action and the modifiers that describe the physical production of the gesture action in more detail, taking the illustrated ‘reach’ instance as an example. Gesture record (parent annotation): which bodily movement is performed (here: ‘Reach’)? Body part signaller: which body part was used (here: ‘Arm’)? Signaller laterality: was it their left or right body part (here: ‘Right’)? Object used: was the gesture produced using an object (here: ‘None’)? Flexion: in a free limb gesture action (one in which there is no required contact with an object, substrate, or additional body part in order to perform the action), were the elbow, wrist, and/or fingers bent past 45° (here: ‘Elbow’)? Orientation: in a free limb gesture action, what direction does the palm of the hand or sole of the foot face (here: ‘Side’)? Repetition count: for gesture types that have rhythmic repeated movements, how many times was the action repeated (here: ‘No value’, as ‘reach’ is not a repetition gesture action; note that this tier is a free-text variable)? Body part contact: for contact gestures, we also code the recipient body part that the gesture contacts (here: ‘None’, as ‘reach’ is not a contact gesture action). Additionally, we estimate the audibility of the gesture instance and whether there was evidence of directionality, e.g., whether the reach was extended towards an individual or location of potential interest (note: neither variable is illustrated here). See the controlled-vocabulary Excel file (GOv1.0_Elan_controlled_vocabulary.xlsx) for the full lists of options for the gesture action and modifier variables, and the GesturalOrigins coding protocol (GOv1.0_Protocol) for details on how to code each variable (both files are accessible here: https://github.com/CharlotteGrund/Gestural_Origins_Coding‐methods_paper).
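As an illustration only (field names are assumptions, not the published ELAN tier names), the parent gesture-record annotation and its modifier tiers from Fig. 1 could be represented as a simple record structure:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GestureRecord:
        gesture_action: str                      # parent annotation, e.g. "Reach"
        body_part_signaller: str                 # e.g. "Arm"
        signaller_laterality: str                # "Left" or "Right"
        object_used: str                         # "None" if no object involved
        flexion: Optional[str] = None            # free limb actions only, e.g. "Elbow"
        orientation: Optional[str] = None        # free limb actions only, e.g. "Side"
        repetition_count: Optional[str] = None   # free-text, rhythmic actions only
        body_part_contact: Optional[str] = None  # contact gesture actions only

    # The worked 'reach' example from Fig. 1:
    reach = GestureRecord(
        gesture_action="Reach",
        body_part_signaller="Arm",
        signaller_laterality="Right",
        object_used="None",
        flexion="Elbow",
        orientation="Side",
    )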
Fig. 2
Illustration of the different stroke phases of variable and stable gesture action types, using ‘reach’ and ‘beckon’ as examples. Note. We assume that the information content within a gesture action may not be spread evenly over the whole gesture duration but may instead take the form of discrete units (e.g., action and hold), and we illustrate this with two example gesture actions. A ‘Reach’ (a variable gesture action type): the minimum action unit (MAU) and the gesture action (GA) start as soon as the signaller moves their arm from the neutral position (very light grey line), i.e., starts to gesture. The MAU ends when the movement phase is completed, i.e., the reach is at its maximal extension towards the recipient (dark grey line; MAU duration = neutral position to MAU end). The gesture action continues until the signaller starts to lower their arm, i.e., when the gesture is no longer held in place. This optional hold phase between the end of the MAU and the end of the GA is indicated by the grey airbrush fill (gesture action duration = neutral position to GA end, i.e., end of hold/repetition phase). We also annotate when the arm is back in its neutral position to track the total time invested in the gesture production (full gesture duration = neutral position to neutral position). B ‘Beckon’ (a stable gesture action type): the MAU and the GA start as soon as the signaller moves their arm from the neutral position (very light grey line), i.e., starts to gesture. The MAU ends when the movement phase is completed (i.e., after the full scooping beckon action). As this is a stable gesture action, there is no optional hold/repetition phase, and the gesture action ends at the same time as the MAU (MAU/GA duration = neutral position to MAU/GA end). As in the reach, we also annotate the time when the arm is back in its neutral position (full gesture duration = neutral position to neutral position).
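A minimal sketch of the three duration measures in Fig. 2, computed from the annotated time points; the parameter names and example values are hypothetical and only illustrate the arithmetic.

    def gesture_durations(neutral_start, mau_end, ga_end, neutral_return):
        """All arguments are annotated time points in seconds.
        neutral_start  : signaller leaves the neutral position (gesture onset)
        mau_end        : movement phase complete (minimum action unit end)
        ga_end         : end of any hold/repetition phase (gesture action end)
        neutral_return : body part back in the neutral position
        """
        return {
            "MAU_duration": mau_end - neutral_start,
            "gesture_action_duration": ga_end - neutral_start,
            "full_gesture_duration": neutral_return - neutral_start,
        }

    # A 'reach' with a hold phase: the gesture action outlasts the MAU.
    print(gesture_durations(0.0, 0.8, 2.1, 2.6))
    # A 'beckon' (stable action type): MAU end and gesture action end coincide.
    print(gesture_durations(0.0, 0.7, 0.7, 1.0))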
Fig. 3
Worked example 1: Difference in variability between the MAU duration, the gesture action duration, and the full gesture duration. Note. Boxplots of the gesture actions ‘beckon’ (n = 26), ‘raise’ (n = 54), and ‘reach’ (n = 85) showing the difference in duration variability between A the minimum action unit (MAU) phase of a gesture action (MAU duration median for ‘beckon’ = 0.74 s (range = 0.26–1.75 s), for ‘raise’ = 0.82 s (range = 0.25–3.48 s), and for ‘reach’ = 0.76 s (range = 0.30–1.90 s); MAU duration mean for ‘beckon’ = 0.74 s (SD = 0.37), for ‘raise’ = 0.91 s (SD = 0.60), and for ‘reach’ = 0.79 s (SD = 0.35)), B the time taken to produce and maintain the gesture action (gesture action (GA) phase of a gesture action: GA duration median for ‘beckon’ = 0.81 s (range = 0.26–2.3 s), for ‘raise’ = 1.34 s (range = 0.37–12.64 s), and for ‘reach’ = 1.18 s (range = 0.35–5.04 s); GA duration mean for ‘beckon’ = 0.95 s (SD = 0.48), for ‘raise’ = 2.55 s (SD = 2.75), and for ‘reach’ = 1.36 s (SD = 0.87)), and C including the recovery phase of the gesture (full gesture duration median for ‘beckon’ = 1.00 s (range = 0.44–2.67 s), for ‘raise’ = 1.90 s (range = 0.44–12.64 s), and for ‘reach’ = 1.58 s (range = 0.53–6.10 s); full gesture duration mean for ‘beckon’ = 1.17 s (SD = 0.51), for ‘raise’ = 2.87 s (SD = 2.82), and for ‘reach’ = 1.84 s (SD = 1.02)). Species = Gorilla beringei beringei; n = 165 instances of gesture use, n = 26 signallers, filtered from data on mountain gorilla gestural behaviour collected from four social units (Mukiza, Oruzogo, Kyagurilo, Bitukura) in Bwindi Impenetrable National Park, Uganda, between 2019 and 2022 (data in ESM: Worked_example_1_data.csv).
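A minimal sketch of how the per-gesture-action summaries reported in Fig. 3 could be reproduced from the ESM file; the column names used here are assumptions, not the published file's actual headers.

    import pandas as pd

    # Worked_example_1_data.csv is supplied with the paper; column names below
    # (gesture_action plus the three duration columns) are illustrative guesses.
    df = pd.read_csv("Worked_example_1_data.csv")

    for col in ["MAU_duration", "gesture_action_duration", "full_gesture_duration"]:
        summary = df.groupby("gesture_action")[col].agg(
            ["count", "median", "min", "max", "mean", "std"])
        print(col)
        print(summary.round(2))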
Fig. 4
Visualisation of different ways in which response latencies and inter-gesture intervals can be measured using the GesturalOrigins coding scheme. Note. Each communication can be viewed as a string of behaviours on a timeline that starts with the gesturing of one individual towards another and ends with a particular behavioural outcome (typically, either the goal of the signalling individual, i.e., an apparently satisfactory outcome (ASO), or the failure of the communication). In this example of a simple gestural communication, individual A (the signaller) on the left gestures twice and individual B (the recipient) responds with an apparently satisfactory outcome (the ‘goal’ of the communication) after a certain amount of time has elapsed. If we take the inter-gesture interval as starting at the end of the gesture action duration and ending at the onset of the next gesture, it is < 1 s; it would not meet the time-based cut-off for response-waiting to be present, and both gestures would be considered part of the same sequence. However, when investigating inter-gesture intervals and response-waiting times, the end of the MAU may be of particular interest, because it approximates the point in time by which all the information necessary for the gesture action to be recognised as a case of that particular gesture action should be in place. If we instead take the inter-gesture interval as starting at the end of the MAU and ending at the onset of the next gesture, the interval in this example is > 1 s; it would meet the time-based cut-off for response-waiting, and the two gesture instances would not be part of the same sequence. Apart from inter-gesture intervals, the coding scheme offers great flexibility in calculating recipient response latencies (see worked example 2); some of these are indicated with purple and grey lines leading from different gesture end points to the outcome (green goal box). Note that, for clarity, and because the difference between the gesture action duration and the full gesture duration appears consistently small (see worked example 1), no distinction is made between these two durations in the graph.
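A minimal sketch of the point Fig. 4 makes: whether two gestures fall in the same sequence under a 1-s response-waiting cut-off depends on which end point (gesture action end vs. MAU end) starts the inter-gesture interval. All time values are illustrative, not taken from the paper's data.

    RESPONSE_WAITING_CUTOFF = 1.0  # seconds; the time-based cut-off used in the example

    def same_sequence(first_gesture_end, next_gesture_onset,
                      cutoff=RESPONSE_WAITING_CUTOFF):
        """Two gestures are treated as part of one sequence if the interval
        between them is below the cut-off (i.e., no response-waiting)."""
        return (next_gesture_onset - first_gesture_end) < cutoff

    gesture1_mau_end = 1.0   # movement phase complete
    gesture1_ga_end = 2.4    # end of the hold phase
    gesture2_onset = 3.1     # next gesture begins

    # Interval measured from the gesture action end: < 1 s, same sequence.
    print(same_sequence(gesture1_ga_end, gesture2_onset))   # True
    # Interval measured from the MAU end: > 1 s, response-waiting met, separate sequences.
    print(same_sequence(gesture1_mau_end, gesture2_onset))  # False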
Fig. 5
Worked example 2.1: Mountain gorilla and chimpanzee recipient latencies to (behaviourally) respond to a signaller’s request to be groomed using the ‘present’ gesture action (n = 281 successful communications). Note. Boxplots showing the response waiting times (latencies to respond) in East African chimpanzees (EAC; n = 177 communications, n = 70 signallers) and mountain gorillas (MG; n = 104 communications, n = 27 signallers) for the gesture action ‘present’ and the outcome ‘grooming’ (total: n = 281 communications) when considering either A the minimum action unit (MAU) end point or B the gesture action (GA) end point as the start of response waiting (see Fig. 4 for a conceptual visualisation of the different response latency measurements). Mountain gorillas seem to be slower to respond to grooming requests than East African chimpanzees. Note that, for better resolution, a single value (the maximum of 78.1 s for the mountain gorillas) is not visually represented on the graph, though it is not omitted from the scaling. As the ‘present’ gesture action is characteristically held in place until the goal is fulfilled (in successful communications), there is unsurprisingly little variation in the gesture action to outcome latency (graph B), which is close to 0 in both species (MG GA.Outcome latency range = –0.2 to 1.1 s; EAC GA.Outcome latency range = –1.5 to 1.0 s). The negative latencies result from instances in which the recipient responded to the gesture before the respective action phase (MAU and/or GA) was completed.
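A minimal sketch of the two latency measures compared across species in Fig. 5; the file name and column names here are assumptions for illustration and do not necessarily match the published ESM data.

    import pandas as pd

    # Hypothetical input: one row per successful 'present'-for-grooming communication,
    # with annotated time points (seconds) and a species code ("MG" or "EAC").
    df = pd.read_csv("Worked_example_2.1_data.csv")

    # Latency measured from the MAU end point vs. the gesture action end point.
    df["latency_from_MAU_end"] = df["outcome_onset"] - df["MAU_end"]
    df["latency_from_GA_end"] = df["outcome_onset"] - df["gesture_action_end"]

    # Negative values indicate the recipient responded before that action phase ended.
    print(df.groupby("species")[["latency_from_MAU_end",
                                 "latency_from_GA_end"]].describe().round(2))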
Fig. 6
Worked example 2.2: Mountain gorilla and chimpanzee recipient latencies to (behaviourally) respond to a signaller’s gestures (successful communications, all goals except ‘play’). Note. Boxplots showing the response waiting times (latencies to respond) in East African chimpanzees (EAC) and mountain gorillas (MG) in 958 successful communications (MG: n = 250 communications, n = 37 signallers; EAC: n = 707 communications, n = 115 signallers – including all gesture actions and excluding the goal ‘play’; data in ESM: Worked_example_2.2_data.csv) when considering either the MAU end point (A and C) or the gesture action end point (B and D) as the start of response waiting. Graph A includes the full range of latency values observed for MAU end to outcome (range: –2.1 to 78.1 s) and graph B the full range of latency values for gesture action end to outcome (range: –7.8 to 31.6 s), while graphs C and D show only those latencies with values between –5 and 25 s (for better resolution on where the majority of the data lie, while not omitting the more extreme values from the scaling). The data suggest that mountain gorillas take longer than chimpanzees to respond to gestural requests, whether the MAU end point or the GA end point is considered the start of response waiting. As in worked example 2.1, the negative latencies result from instances in which the recipient responded to the gesture before the respective action phase (MAU and/or GA) was completed.
