Oscillations make a self-scaled model for honeybees’ visual odometer reliable regardless of flight trajectory

Authors
  • Bergantin, Lucia
  • Harbaoui, Nesrine
  • Raharijaona, Thibaut
  • Ruffier, Franck
Type
Published Article
Journal
Journal of The Royal Society Interface
Publisher
The Royal Society
Publication Date
Sep 08, 2021
Volume
18
Issue
182
Identifiers
DOI: 10.1098/rsif.2021.0567
PMID: 34493092
PMCID: PMC8424324
Source
PubMed Central
Disciplines
  • Research Articles
License
Unknown

Abstract

Honeybees foraging and recruiting nest-mates by performing the waggle dance need to be able to gauge the flight distance to the food source regardless of the wind and terrain conditions. Previous authors have hypothesized that the foragers’ visual odometer mathematically integrates the angular velocity of the ground image sweeping backward across their ventral viewfield, known as translational optic flow. The question arises as to how mathematical integration of optic flow (usually expressed in radians/s) can reliably encode distances, regardless of the height and speed of flight. The vertical self-oscillatory movements observed in honeybees trigger expansions and contractions of the optic flow vector field, yielding an additional visual cue called optic flow divergence. We have developed a self-scaled model for the visual odometer in which the translational optic flow is scaled by the visually estimated current clearance from the ground. In simulation, this model, which we have called SOFIa, was found to be reliable in a large range of flight trajectories, terrains and wind conditions. It reduced the statistical dispersion of the estimated flight distances approximately 10-fold in comparison with the mathematically integrated raw optic flow model. The SOFIa model can be directly implemented in robotic applications based on minimalistic visual equipment.
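The central idea summarized in the abstract — scaling translational optic flow by an estimate of the current ground clearance before integrating it, rather than integrating the raw optic flow — can be illustrated with a short sketch. The code below is a hypothetical illustration under simplified assumptions (perfectly known clearance, constant speed), not the authors' SOFIa implementation, which estimates clearance visually from optic flow divergence induced by the bee's vertical oscillations.

```python
import numpy as np

def raw_of_odometer(omega, dt):
    """Integrate raw translational optic flow (rad/s).

    The result is in radians, so it conflates distance with height:
    the same ground distance gives different readings at different
    flight heights.
    """
    return np.sum(omega) * dt

def self_scaled_odometer(omega, h_est, dt):
    """Hypothetical self-scaled odometer sketch (not the authors' SOFIa code).

    Translational optic flow omega ~ v / h, so scaling each sample by an
    estimate of the current ground clearance h recovers ground speed v,
    whose time integral is the travelled distance in metres.
    """
    v_est = omega * h_est          # rad/s * m -> m/s
    return np.sum(v_est) * dt      # metres

# Toy example: constant ground speed of 2 m/s for 10 s at two flight heights.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
v_true = 2.0
for h in (1.0, 3.0):                       # flight heights in metres
    omega_series = np.full_like(t, v_true / h)   # translational optic flow
    h_series = np.full_like(t, h)                # clearance, assumed known here
    print(h,
          raw_of_odometer(omega_series, dt),            # height-dependent (rad)
          self_scaled_odometer(omega_series, h_series, dt))  # ~20 m in both cases
```

In this toy setting the raw integrated optic flow changes with flight height while the self-scaled estimate does not, which mirrors the reliability gain the abstract reports for the SOFIa model across varying trajectories.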
