R Soc Open Sci. 2024 Jan 24;11(1):231415. doi: 10.1098/rsos.231415. eCollection 2024 Jan.

Discrimination of object information by bat echolocation deciphered from acoustic simulations

Yu Teshima et al. R Soc Open Sci. 2024.

Abstract

High-precision visual sensing has been achieved by combining cameras with deep learning. However, an unresolved challenge involves identifying information that remains elusive for optical sensors, such as occlusion spots hidden behind objects. Compared with light, sound waves have longer wavelengths and can therefore collect information on occlusion spots. In this study, we investigated whether bats could perform advanced sound sensing using echolocation to acquire a target's occlusion information. We conducted a two-alternative forced choice test on Pipistrellus abramus with five different targets, including targets with high visual similarity from the front but different backend geometries, i.e. occlusion spots or textures. Subsequently, the echo impulse responses produced by these targets, which were difficult to obtain with real measurements, were computed using three-dimensional acoustic simulations to provide a detailed analysis of the acoustic cues that the bats obtained through echolocation. Our findings demonstrated that bats could effectively discern differences in target occlusion spot structure and texture through echolocation. Furthermore, the discrimination performance was related to the differences in the logarithmic spectral distortion of the occlusion-related components in the simulated echo impulse responses. This suggests that the bats obtained occlusion information through echolocation, highlighting the advantages of utilizing broadband ultrasound for sensing.
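The abstract links discrimination performance to the logarithmic spectral distortion between simulated echo impulse responses. A common definition of log-spectral distortion is the root-mean-square difference of the log-magnitude spectra of two signals; the sketch below follows that definition (the function name, FFT size, and test signals are illustrative assumptions, not values from the study):

```python
import numpy as np

def log_spectral_distortion(echo_a, echo_b, n_fft=1024):
    """Log-spectral distortion (dB) between two echo impulse responses,
    computed as the RMS difference of their log-magnitude spectra."""
    eps = 1e-12  # avoid taking the log of zero
    spec_a = 20 * np.log10(np.abs(np.fft.rfft(echo_a, n_fft)) + eps)
    spec_b = 20 * np.log10(np.abs(np.fft.rfft(echo_b, n_fft)) + eps)
    return np.sqrt(np.mean((spec_a - spec_b) ** 2))

# Identical echoes give zero distortion; a uniformly attenuated copy
# gives a constant spectral offset of 20*log10(2) ~ 6.02 dB.
rng = np.random.default_rng(0)
echo = rng.standard_normal(256)
print(log_spectral_distortion(echo, echo))        # 0.0
print(log_spectral_distortion(echo, 0.5 * echo))  # ~6.02
```

Echoes from targets with similar front surfaces but different rear (occlusion) structure would differ mainly in late, occlusion-related components, so a distortion measure of this kind can be restricted to those components by windowing the impulse response before the FFT.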

Keywords: FDTD simulation; occlusion spots; target discriminations.
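The "FDTD simulation" keyword refers to the finite-difference time-domain method, which integrates the acoustic wave equation on a spatial grid to compute echo impulse responses. A minimal one-dimensional sketch on a staggered pressure-velocity grid (illustrative only; the study used three-dimensional simulations of ultrasound scattering, and all grid parameters here are assumptions):

```python
import numpy as np

# Minimal 1-D acoustic FDTD sketch: a Gaussian pulse is emitted at a source
# cell, travels down the domain, reflects off a rigid wall at the far end,
# and the returning echo is recorded back at the source cell.
c = 343.0            # speed of sound in air, m/s
rho = 1.2            # density of air, kg/m^3
dx = 1e-3            # grid spacing, m
dt = dx / (2 * c)    # time step (Courant number 0.5, satisfies CFL stability)
n = 400              # number of pressure cells

p = np.zeros(n)      # pressure at cell centres
v = np.zeros(n + 1)  # particle velocity at cell faces (staggered grid)
K = rho * c ** 2     # bulk modulus of air

src = mic = 50
echo = []
for t in range(2000):
    # velocity update from the pressure gradient; v[0] = v[-1] = 0 (rigid walls)
    v[1:-1] -= dt / (rho * dx) * (p[1:] - p[:-1])
    # pressure update from the velocity divergence
    p -= dt * K / dx * (v[1:] - v[:-1])
    # soft source: short Gaussian pulse injected at the source cell
    p[src] += np.exp(-(((t - 60) / 20.0) ** 2))
    echo.append(p[mic])

# With the wave advancing 0.5 cells per step, the far-wall reflection returns
# after about 2 * (400 - 50) / 0.5 = 1400 steps, i.e. near step 1460.
```

Running the impulse response of a digitized 3-D target through such a solver, as in figure 1(c), yields the simulated echoes that are then compared via log-spectral distortion.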


Conflict of interest statement

The authors declare there are no conflicts of interest.

Figures

Figure 1.
(a) Front and top views of the five targets used in the experiment: T1 operant stimulus target; (b) top view of the experimental system of the two-alternative forced choice test; (c) top and side views of the digital three-dimensional target (T1) used to calculate echo simulation in the simulation space.
Figure 2.
(a) Probability of choosing T1 (operant stimulus) for each bat in the four different two-alternative forced-choice tests; (b) spectrogram of the simulated echo impulse response (top panel) and FM convolved echo (bottom panel) for each target (T1–T5) (S: surface-related echoes, O: occlusion-related echoes); (c) average success rate (choosing T1) for each test trial (bar graph), and surface-related and occlusion-related log-spectral distortion for each target (T2–T5 and T1; line graph).
Figure 3.
(a) Trajectory of the bats toward targets T1 (red) and T4 (black) in the T1 versus T4 test trial, where the solid and dotted lines represent the mean and standard deviation, respectively. Because T1 and T4 were randomly positioned on the left and right sides of the experimental setup, the trajectories for the trials in which T1 was positioned to the right were inverted. (b) The pulse position and direction of the bats during one trial toward T1 and one toward T4 in the T1 versus T4 test trial. (c) Log-spectral distortion (T4–T1) of the echoes calculated by acoustic simulation at positions P1–P4 (correct shown in red and incorrect shown in black), shown in (b).

