Review

Front Psychol. 2020 Jan 17;10:3001. doi: 10.3389/fpsyg.2019.03001. eCollection 2019.

The Influence of Auditory Cues on Bodily and Movement Perception

Tasha R Stanton et al.

Abstract

The sounds that result from our movement and that mark the outcome of our actions typically convey useful information concerning the state of our body and its movement, as well as pertinent information about the stimuli with which we are interacting. Here we review the rapidly growing literature investigating the influence of non-veridical auditory cues (i.e., inaccurate in terms of their context, timing, and/or spectral distribution) on multisensory body and action perception and on motor behavior. Inaccurate auditory cues provide a unique opportunity to study cross-modal processes: the contribution of each sense is easier to detect when the senses convey slightly different messages. Additionally, given that similar cross-modal processes likely occur regardless of the accuracy of the sensory input, studying incongruent interactions is also likely to help us predict interactions between congruent inputs. The available research convincingly demonstrates that perceptions of the body, of movement, and of surface contact features (e.g., roughness) are influenced by the addition of non-veridical auditory cues. Moreover, auditory cues impact both motor behavior and emotional valence, the latter showing that sounds highly incongruent with the performed movement induce feelings of unpleasantness (perhaps associated with lower processing fluency). Such findings are relevant to the design of auditory cues for product interaction and, given the impact on motor behavior, to the use of auditory cues in sport performance and therapeutic settings.

Keywords: auditory; body perception; emotional valence; movement; multisensory integration; perception.


Figures

FIGURE 1
Classification of auditory cue types.
FIGURE 2
Experimental set-up and results from Senna et al.’s (2014) marble hand illusion Experiment 1. (A) Experimental set-up; (B) Experimental conditions of temporally synchronous sound and skin tap (marble hand illusion) or asynchronous sound and touch (control); (C) Results for perceived finger properties assessed via questionnaire (mean change [post- minus pre-testing] ± standard error of the mean); ∗p < 0.05; ∗∗p < 0.01; (D) Results for arousal to a threatening stimulus as measured using galvanic skin response (GSR), showing an increase in arousal for the marble hand illusion condition but not the control condition (mean and standard error of the mean shown); (E) Relationship between perceived hand stiffness and mean arousal (GSR) for the marble hand illusion condition. A significant positive correlation (Pearson’s r = 0.6, p = 0.02) was found between changes in perceived hand stiffness and changes in arousal (larger dots represent two points falling in close proximity). [Reproduction of Figure 1 of Senna et al. (2014). Reproduced via the Creative Commons Attribution (CC BY) License].
FIGURE 3
Results from Tajadura-Jiménez et al.’s (2015a) experiment highlighting the consequences of frequency manipulation (normal, low, high) of the sound of footsteps while walking. The authors explored the effects of this manipulation on perceived body weight, galvanic skin response (GSR), heel contact time (milliseconds) during walking, and acceleration of the foot’s upward movement during walking. All values are mean ± standard error. ∗Denotes a significant difference between means. [From Figure 4 of Tajadura-Jiménez et al. (2015a). Reproduced with permission of the Association for Computing Machinery in the format Republish in a journal/magazine via Copyright Clearance Center].
FIGURE 4
Results from Kim et al.’s (2007) study evaluating the influence of amplifying and attenuating sound frequency components on: (A) perceived roughness; (B) perceived ‘ruggedness.’ The sound of rubbing sandpaper (Sound 1, grit size #24) was provided alone (“no Tactile info”) or paired with haptic interaction (index finger) using two different tactile settings on a texture display mouse. The auditory stimuli were divided into sound frequency bands (where 1 = 20–36 Hz; 2 = 36–56 Hz; 3 = 56–96 Hz; 4 = 96–174 Hz; 5 = 174–284 Hz; 6 = 284–584 Hz) that were either amplified by 20 dB (+) or attenuated by 20 dB (–). The sound index (x-axis) refers to the frequency–intensity combination. The sensitivity strength (y-axis) refers to ratings provided on a 7-point Likert scale (midpoint = neutral) for perceived roughness and ruggedness. These results highlight that amplifying the 30–600 Hz frequency range increased perceptions of roughness in all conditions, whereas amplifying the 30–300 Hz range increased perceptions of ruggedness during virtual haptic interaction (touch conditions), and the 56–96 Hz range did so during the no-touch (sound only) conditions. [From Figures 4, 7 of Kim et al. (2007). Reprinted, with permission, from IEEE Proceedings (Computer Society) in the format Republish in a journal/magazine via Copyright Clearance Center].
FIGURE 5
Results from Juravle and Spence’s (2011) study highlighting that movement-induced changes in sensory perception are specific to the type of sensory input and its relevance to the movement. E1 refers to Experiment 1 and shows that sensitivity (y-axis) in detecting a gap in tactile stimulation was lower when the stimulation was delivered to the juggling hand than when delivered to the hand at rest. E2 refers to Experiment 2 and shows that sensitivity in detecting a gap in auditory stimuli was higher when the stimuli were delivered during the juggling condition. Box plots represent the middle 50% of data (dark line = median) and whiskers represent the interquartile range (+ sign = values beyond 1.5 times the interquartile range). [From Figure 1a of Juravle and Spence (2011). Reproduced with permission of Springer Nature under the Copyright Transfer Agreement and License to Publish Agreement of the author, which allows adaptation of figures for style and formatting purposes on the condition that this does not alter the meaning of the content].
FIGURE 6
Experimental set-up and results from Zampini and Spence’s (2004) study demonstrating the influence of manipulating biting sounds (airborne component) when biting into a potato chip. (A) Experimental set-up; note that during testing the booth door was closed and participants provided responses via computer screens situated through the wall (left-hand side) of the booth. (B) Perceived crispness of the chip (y-axis) during sound frequency alteration (x-axis) and three sound intensity conditions (0 dB, –20 dB, –40 dB). Results show that amplifying the high-frequency components of the sound increased perceptions of crispness (unless the sound was very quiet: –40 dB), and that decreasing sound intensity increased perceptions of chip softness. (C) Perceived freshness of the chip (y-axis) during sound frequency alteration (x-axis) and three sound intensity conditions (0 dB, –20 dB, –40 dB). Amplifying the high-frequency components of the sound increased perceptions of freshness (unless the sound was very quiet: –40 dB), and decreasing sound intensity increased perceptions of chip staleness. [From Figures 1, 2a,b of Zampini and Spence (2004). Reproduced with permission of John Wiley & Sons under author agreements which allow the author reuse of up to three figures in a new journal article].


References

    1. Aglioti S., Pazzaglia M. (2010). Representing actions through their sound. Exp. Brain Res. 206, 141–151. 10.1007/s00221-010-2344-x
    2. Agostini T., Righi G., Galmonte A., Bruno P. (2004). “The relevance of auditory information in optimizing hammer throwers performance,” in Biomechanics and Sports, ed. Pascolo P. B. (Vienna: Springer), 67–74. 10.1007/978-3-7091-2760-5_9
    3. Andreasen A., Geronazzo M., Nilsson N. C., Zovnercuka J., Konovalov K., Serafin S. (2019). Auditory feedback for navigation with echoes in virtual environments: training procedure and orientation strategies. IEEE Trans. Vis. Computer Graph. 25, 1876–1886. 10.1109/TVCG.2019.2898787
    4. Andreasen A., Zovnercuka J., Konovalov K., Geronazzo M., Paisa R., Serafin S. (2018). “Navigate as a bat. Real-time echolocation system in virtual reality,” in Proceedings of the 15th Sound and Music Computing Conference, Limassol, 198–205.
    5. Angelaki D. E., Shaikh A. G., Green A. M., Dickman J. D. (2004). Neurons compute internal models of the physical laws of motion. Nature 430, 560–564. 10.1038/nature02754
