Front Behav Neurosci. 2021 Jan 19;14:606590. doi: 10.3389/fnbeh.2020.606590. eCollection 2020.

The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight

Charlotte Doussot et al.

Abstract

Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including its spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: bees employ a saccadic flight and gaze strategy, in which rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, i.e., depth in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest-hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that the head appears to pivot actively about half of the time. However, only a few of the corresponding pivot points lie close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
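
The egocentric case can be made concrete with a small numerical sketch (illustrative numbers and variable names only, not values from the study): for pure translation with a constant gaze direction, the distance to an object follows from the bee's transverse velocity and the object's apparent angular velocity on the retina.

```python
import numpy as np

# Minimal sketch of depth from motion-parallax during an intersaccade, assuming
# pure translation with constant gaze:
#   distance ≈ v_transverse / omega_apparent
# where v_transverse is the velocity component perpendicular to the viewing
# direction and omega_apparent is the object's angular velocity on the retina.
flight_speed = 0.30                  # m/s, assumed translation speed
viewing_angle = np.radians(60)       # angle between flight direction and object
omega_apparent = np.radians(50)      # rad/s, apparent angular velocity of object
v_transverse = flight_speed * np.sin(viewing_angle)
distance = v_transverse / omega_apparent
print(f"estimated distance: {distance:.2f} m")   # ≈ 0.30 m for these numbers
```
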

Keywords: active vision; hymenopterans; motion-parallax; navigation; optic-flow; view-matching; visual homing; visual learning.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
Experimental set-up. (A) Representation of the experimental set-up, recreated with the software Blender. The bumblebee enters the flight arena through the nest-hole, which is connected to the hive by a tube, and takes off from the center of the arena. Learning flights were recorded by three cameras from above the arena. The flight arena was illuminated by four blocks of four LEDs, and its roof was a transparent acrylic plate. (B) Single cropped frame from our footage showing a marked bumblebee during a learning flight; green arrows indicate the head markers and purple arrows point to the three thorax markers. (C) Photograph of the inside texture of the arena as used during the experiments, showing the nest-hole and the exit-hole through which bees leave to forage. The walls are covered with a red noise pattern.
Figure 2
Head and thorax spatial orientation. (A) The head coordinate system: the bumblebee head with the three markers and the yaw, pitch, and roll axes. (B) The world coordinate system: 3D representation of a learning flight's initial phase. (C) Top view of the learning flight section showing the down-sampled yaw orientation. The head direction is indicated by the arrowhead. Time along the trajectory is indicated by the arrowhead color, following the color bar on the right. Purple arrows indicate saccades and green arrows intersaccades. (D) Filtered time courses of the head YPR orientation for the flight section shown in (C), with yaw in purple, pitch in green, and roll in blue. Each orientation is overlaid with the standard deviation of the error in degrees (too small to be visible). Rectified yaw velocity is shown on the right axis (black). Gray shaded areas represent saccades determined by the two-thresholds method (see text): for the head, onset threshold (upper blue line) = 372.42°·s−1 and ending threshold (lower red line) = 200.5°·s−1. (E) Filtered time courses of the thorax YPR orientation (left axis) and rectified yaw velocity (right axis) for the flight section shown in (C); legends as in (D). Saccades defined on the basis of head velocity are indicated by dotted blocks. (F) Top view of a smaller learning flight section showing the down-sampled yaw orientation of the head and the thorax. The head direction is indicated by the arrow's tail in gray, the thorax direction by the arrowhead (green arrows). Time along the trajectory is indicated by the arrowhead color, as in (E). Blue arrows indicate a positive head yaw drift above 5° and orange arrows a negative drift below −5°. (G) Filtered time courses of the head (gray line) and thorax (colored line) yaw orientation. Blue and orange indicate strong drift as in (F), and gray indicates intersaccades with less drift. The color of the thorax line indicates time, as in (F). Head and thorax orientations are not aligned in the second half of the section: the head is oriented rightward relative to the thorax.
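
The two-thresholds saccade detection mentioned in the Figure 2 caption can be sketched as follows (a minimal Python illustration, assuming a filtered, evenly sampled yaw-velocity trace; only the two threshold values are taken from the caption, and the function and variable names are hypothetical):

```python
import numpy as np

def detect_saccades(yaw_velocity, onset_thr=372.42, end_thr=200.5):
    """Two-thresholds saccade detection on a yaw-velocity trace (deg/s).

    A saccade is triggered where the rectified velocity exceeds the onset
    threshold and is extended backward and forward to where it drops below
    the ending threshold. Returns a list of (start_index, end_index) pairs.
    """
    speed = np.abs(yaw_velocity)
    saccades = []
    i, n = 0, len(speed)
    while i < n:
        if speed[i] >= onset_thr:
            start = i
            while start > 0 and speed[start - 1] >= end_thr:
                start -= 1          # walk back to the ending-threshold crossing
            end = i
            while end < n - 1 and speed[end + 1] >= end_thr:
                end += 1            # walk forward to the ending-threshold crossing
            saccades.append((start, end))
            i = end + 1
        else:
            i += 1
    return saccades
```
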
Figure 3
Propagated error on the YPR orientations. From left to right: kernel density estimation and distribution of the errors (variance) for the yaw, pitch, and roll orientations, in degrees. μ indicates the mean and σ the standard deviation.
Figure 4
Two active vision strategies. Schematic of the two active vision strategies and their impact on the retinal displacement of visual landmarks. For illustrative purposes, the head orientation is drawn aligned with the body axis. (A) Motion-parallax. As a consequence of translation, the bumblebee gains distance information about the landmarks relative to its own current position. Here, the purple landmark moves more slowly on the bumblebee's eye (shorter retinal displacement, purple arrow) than the green landmark (longer retinal displacement, green arrow). Thus, the purple object is farther from the bumblebee. (B) Pivoting-parallax. The bumblebee pivots around a point, the pivot point, by a certain rotation angle while translating. By doing so, the bumblebee gains distance information relative to the pivot point. Here, the purple landmark moves on the retina in the opposite direction to the green landmark (see the corresponding arrows of retinal displacement), because the latter lies between the pivot point and the bumblebee. The black circle represents the zero-horopter (as named in Zeil, 1993a), which separates areas of image motion with opposite sign: inside the horopter, the green landmark follows the rotation of the bumblebee, and outside it, the purple landmark moves in the opposite direction. The equation for the horopter is given in Zeil (1993a).
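
The geometric difference between the two strategies can be illustrated numerically (a hedged sketch with made-up positions and speeds, not the authors' analysis code): the apparent angular velocity of a point landmark is the change of its world bearing minus the gaze rotation. For pure translation its magnitude decreases with distance, whereas for a pivoting motion its sign flips between landmarks located between the bee and the pivot point and landmarks beyond it.

```python
import numpy as np

def retinal_velocity(bee_pos, landmark, velocity, yaw_rate):
    """Angular velocity (rad/s) of a landmark on the retina, 2D approximation.

    bee_pos, landmark, velocity are 2D vectors in the world frame; yaw_rate is
    the rotation rate of the gaze. The retinal motion is the rate of change of
    the landmark's world bearing minus the gaze rotation rate.
    """
    dx, dy = landmark[0] - bee_pos[0], landmark[1] - bee_pos[1]
    r2 = dx * dx + dy * dy
    bearing_rate = (dy * velocity[0] - dx * velocity[1]) / r2  # d/dt atan2(dy, dx)
    return bearing_rate - yaw_rate

# Motion-parallax: pure translation (yaw_rate = 0); the nearer landmark moves faster.
bee = np.array([0.0, 0.0])
v = np.array([0.0, 0.1])                              # m/s, sideways translation
near, far = np.array([0.2, 0.0]), np.array([0.6, 0.0])
print(retinal_velocity(bee, near, v, 0.0), retinal_velocity(bee, far, v, 0.0))

# Pivoting-parallax: the gaze rotates about a pivot point 0.3 m ahead of the bee;
# a landmark between bee and pivot and one beyond the pivot drift in opposite
# directions on the retina (the sign change defines the zero-horopter).
pivot = np.array([0.3, 0.0])
omega = 1.0                                           # rad/s rotation about the pivot
r = bee - pivot
v_pivot = omega * np.array([-r[1], r[0]])             # tangential velocity of the bee
inside, outside = np.array([0.2, 0.0]), np.array([0.8, 0.0])
print(retinal_velocity(bee, inside, v_pivot, omega),
      retinal_velocity(bee, outside, v_pivot, omega))
```
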
Figure 5
Intersaccadic yaw drift and pivot point locations. (A) Kernel density estimation (KDE) of the yaw drift during intersaccades, expressed in degrees per intersaccade: KDE for all flights (thick black line) and for each flight (colored lines, see legend). (B) Schematic illustration of the method for estimating the pivot point location. With a positive rotational speed, or drift angle, the pivot point lies in the heading direction of the bee. Note that pivoting-parallax can result from head rotation alone; head and thorax are therefore not necessarily aligned during pivoting-parallax. (C) With a negative drift angle, the pivot point lies behind the heading direction.
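
One way to picture the estimate in (B,C) is a simple circular-arc approximation (an assumption for illustration only, not necessarily the procedure used in the paper): treating the intersaccade as a rotation by the drift angle about a fixed center, the pivot point is the center of the arc connecting the start and end positions, placed on the side given by the sign of the drift.

```python
import numpy as np

def estimate_pivot(p_start, p_end, drift_deg):
    """Approximate the pivot point of an intersaccade as the center of a circular
    arc traversed while the gaze drifts by drift_deg (hypothetical geometry).

    p_start, p_end: 2D positions (e.g., head positions at intersaccade start/end).
    Returns None for (almost) zero drift, i.e., no finite pivot point.
    """
    drift = np.radians(drift_deg)
    if abs(drift) < 1e-6:
        return None                         # pure translation: pivot at infinity
    chord = np.asarray(p_end, float) - np.asarray(p_start, float)
    d = np.linalg.norm(chord)
    if d < 1e-9:
        return np.asarray(p_start, float)   # rotation on the spot: pivot at the bee
    radius = d / (2.0 * np.sin(abs(drift) / 2.0))
    midpoint = (np.asarray(p_start, float) + np.asarray(p_end, float)) / 2.0
    # unit normal to the chord; its side follows the sign of the drift
    normal = np.array([-chord[1], chord[0]]) / d * np.sign(drift)
    offset = np.sqrt(max(radius**2 - (d / 2.0) ** 2, 0.0))
    return midpoint + normal * offset
```
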
Figure 6
Analysis of head and thorax rotations. (A) Yaw orientation during saccades: head's yaw average (purple line) and thorax's yaw average (dotted line); the different flights are individually colored [blue, orange, green, and red (same bee “a”), purple and brown (bees “b” and “c”)]. (B) Average yaw velocity during saccades. (C) Boxplot of the distribution of the delay between the thorax's and the head's yaw-velocity peaks during saccades; a negative value indicates that the thorax peak precedes the head peak. (D) From left to right: distributions of the yaw, pitch, and roll angular velocities of the head (wz, wy, wx, respectively) during saccades (dotted line) and intersaccades (continuous line).
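
The head–thorax delay summarized in (C) can be illustrated with a short sketch (assumed approach with hypothetical names; the paper's exact procedure may differ): within one saccade window, take the times of the maximal rectified yaw velocities of head and thorax and subtract them.

```python
import numpy as np

def peak_delay_ms(head_yaw_vel, thorax_yaw_vel, sample_rate_hz):
    """Delay (ms) of the thorax yaw-velocity peak relative to the head peak
    within one saccade window; a negative value means the thorax peak comes first.

    head_yaw_vel, thorax_yaw_vel: 1D arrays (deg/s) covering the same window.
    """
    head_peak = np.argmax(np.abs(head_yaw_vel))
    thorax_peak = np.argmax(np.abs(thorax_yaw_vel))
    return (thorax_peak - head_peak) / sample_rate_hz * 1000.0
```
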
Figure 7
Pairwise comparison of the SNR for each intersaccadic interval. (A) Pairwise comparison of the SNR for the nest-hole retinal projection, for each flight (n = 6, color coded) and for the different modifiers (none, roll constant, roll and pitch constant). The motion-parallax SNR is on the x-axis and the pivoting-parallax SNR on the y-axis; the bisector line is shown in red. The median log10 SNR for pivoting- and motion-parallax are displayed with a green and a purple dotted line, respectively. (B) Same for the exit-hole SNR.
Figure 8
Pivoting points in the flight arena. Each subplot corresponds to one flight. Pivoting points are color-coded with a diverging color map according to the drift angle of the corresponding intersaccade: drifts below 0 go from white to blue and drifts above 0 from white to red. The color map is bounded at ±8° for illustrative purposes. The arena walls are shown by the red circle. The nest-hole and platform are represented by the gray dot in the middle of the arena. The exit-hole is located at x = 0 and y = −350 mm.

