Perception. 2022 Aug 9;51(10):3010066221116480. doi: 10.1177/03010066221116480. Online ahead of print.

Investigating distortions in perceptual stability during different self-movements using virtual reality

Paul A Warren et al.
Abstract

Using immersive virtual reality (the HTC Vive Head Mounted Display), we measured both bias and sensitivity when making judgements about the scene stability of a target object during both active (self-propelled) and passive (experimenter-propelled) observer movements. This was repeated in the same group of 16 participants for three different observer-target movement conditions in which the instability of the target was yoked to the movement of the observer. We found that, in all movement conditions, the target needed to move with (in the same direction as) the participant to be perceived as scene-stable. Consistent with the presence of additional available information (efference copy) about self-movement during active conditions, biases were smaller and sensitivities to instability were higher in active relative to passive conditions. However, the presence of efference copy was clearly not sufficient to completely eliminate the bias, and we suggest that the availability of additional visual information about self-movement is also critical. We found some (albeit limited) evidence for correlation between appropriate metrics across the different movement conditions. These results extend previous findings, providing evidence for consistency of biases across different movement types, suggestive of common processing underpinning perceptual stability judgements.
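For readers unfamiliar with these measures, and assuming the cumulative Gaussian model described in the figure captions below, the bias and sensitivity reported here can be read off the fitted psychometric function roughly as follows; note that identifying sensitivity with 1/sigma is our gloss for illustration, not necessarily the authors' exact metric:

P(\text{response} = 1 \mid g) = \Phi\!\left(\frac{g - \mu}{\sigma}\right)

where g is the target movement yoked to the observer's movement, \Phi is the standard normal cumulative distribution function, \mu is the PSS (the bias, i.e., the target movement at which the target is most likely to be perceived as scene-stable), and \sigma is the Gaussian s.d., with smaller \sigma corresponding to higher sensitivity to instability.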

Keywords: perception and action; perceptual stability; self-movement; virtual reality.


Conflict of interest statement

Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Figures

Figure 1. Schematic illustration of the three observer movement/target movement combinations.

Figure 2. Example psychometric functions for two (P15 and P16) of our 16 participants in the six conditions considered. In each panel, the dots correspond to local average estimates of the binary response variable. Curves correspond to cumulative Gaussian psychometric functions fitted to the 120 underlying binary responses.

Figure 3. PSS (left panel) and Gaussian s.d. (right panel) data for the three OTMCs and both active (A) and passive (P) movement generation types. Boxplots illustrate the median (thick horizontal line), 25th and 75th percentiles (hinges), and data points less than 1.5 × IQR from the hinge (whiskers).

Figure 4. Normalized perceived self-movement distance that would explain the bias observed in our six conditions. Horizontal dashed lines on the Active (L + R):R condition represent the approximate range of equivalent data from Tcheang et al. (2005). The two stars on the S:S condition data represent approximate mean values for the equivalent conditions from Wexler (2003).
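The article does not publish its fitting code; the following is a minimal Python sketch of the kind of maximum-likelihood cumulative Gaussian fit described in the Figure 2 caption, from which the PSS and Gaussian s.d. shown in Figure 3 would be extracted. The function name, the coding of the binary response, and the synthetic example data are our assumptions for illustration only.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_cumulative_gaussian(stimulus, response):
    """Maximum-likelihood fit of a cumulative Gaussian psychometric function.

    stimulus: signed target-instability levels (arbitrary units).
    response: 0/1 binary judgements (e.g., 1 = "target moved with me").
    Returns (mu, sigma): mu is the PSS (bias), sigma the Gaussian s.d.
    (1/sigma is one common index of sensitivity).
    """
    stimulus = np.asarray(stimulus, dtype=float)
    response = np.asarray(response, dtype=int)

    def neg_log_likelihood(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)          # keep sigma positive
        p = norm.cdf(stimulus, loc=mu, scale=sigma)
        p = np.clip(p, 1e-9, 1 - 1e-9)     # avoid log(0)
        return -np.sum(response * np.log(p) + (1 - response) * np.log(1 - p))

    start = np.array([np.mean(stimulus), np.log(np.std(stimulus) + 1e-6)])
    fit = minimize(neg_log_likelihood, start, method="Nelder-Mead")
    mu, log_sigma = fit.x
    return mu, np.exp(log_sigma)

# Synthetic example: 120 binary trials, as in the per-condition counts above.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 120)
y = (rng.random(120) < norm.cdf(x, loc=-0.1, scale=0.3)).astype(int)
print(fit_cumulative_gaussian(x, y))   # recovers approximately (-0.1, 0.3)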

