J Vis. 2013 Jul 25;13(8):25. doi: 10.1167/13.8.25.

Humans perceive object motion in world coordinates during obstacle avoidance

Brett R Fajen et al. J Vis. 2013.

Abstract

A fundamental question about locomotion in the presence of moving objects is whether movements are guided based upon perceived object motion in an observer-centered or world-centered reference frame. The former captures object motion relative to the moving observer and depends on both observer and object motion. The latter captures object motion relative to the stationary environment and is independent of observer motion. Subjects walked through a virtual environment (VE) viewed through a head-mounted display and indicated whether they would pass in front of or behind a moving obstacle that was on course to cross their future path. Subjects' movement through the VE was manipulated such that object motion in observer coordinates was altered while object motion in world coordinates remained the same. We found that when moving observers choose routes around moving obstacles, they rely on object motion perceived in world coordinates. This entails a process, which has been called flow parsing (Rushton & Warren, 2005; Warren & Rushton, 2009a), that recovers the component of optic flow due to object motion independent of self-motion. We found that when self-motion is real and actively generated, the process by which object motion is recovered relies on both visual and nonvisual information to factor out the influence of self-motion. The remaining component contains information about object motion in world coordinates that is needed to guide locomotion.

Keywords: flow parsing; locomotion; moving objects; obstacle avoidance; optic flow.
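The distinction between the two reference frames can be made concrete with a small sketch. The Python snippet below is illustrative only; the velocity values and variable names are assumptions, not taken from the study. Object velocity expressed relative to the moving observer mixes object motion and self-motion, whereas adding back an estimate of the observer's own velocity recovers object motion relative to the stationary world, which is the quantity flow parsing is thought to provide.

```python
import numpy as np

# Illustrative sketch (values and names are assumptions, not from the study).
# Ground-plane velocities in m/s: x = lateral, z = forward (walking direction).
object_vel_world = np.array([-0.6, 0.0])    # obstacle crossing from right to left
observer_vel_world = np.array([0.0, 1.2])   # observer walking straight ahead

# Observer-centered object motion: relative to the moving observer,
# so it depends on both object motion and self-motion.
object_vel_observer = object_vel_world - observer_vel_world

# World-centered object motion: add back an estimate of self-motion
# (the quantity that flow parsing is thought to recover).
recovered_object_vel_world = object_vel_observer + observer_vel_world

print(object_vel_observer)         # [-0.6 -1.2]  depends on self-motion
print(recovered_object_vel_world)  # [-0.6  0. ]  independent of self-motion
```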

Figures

Figure 1
Optic flow field and decomposition into self-motion and object-motion components. (A) Optic flow field generated by an observer moving over a ground surface and an object (yellow dot) moving from right to left. (B) Component of optic flow due to self-motion independent of object motion. (C) Component of optic flow due to object motion independent of self-motion. The optic flow field in (A) is the vector sum of the self-motion (B) and object-motion (C) components. From Fajen, B. R., & Matthis, J. S. (2013). Visual and non-visual contributions to the perception of object motion during self-motion. PLoS One, 8(2): e55446. doi:10.1371/journal.pone.0055446, used under a Creative Commons Attribution License.
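The same additive relationship holds for a single flow vector. The sketch below uses made-up image velocities (not the paper's data) and simply subtracts the self-motion component predicted at the object's image location from the measured flow to isolate the object-motion component, mirroring the vector sum stated in the caption.

```python
import numpy as np

# Minimal sketch of the vector-sum decomposition in Figure 1
# (the numbers are made up for illustration).
total_flow = np.array([-1.4, 0.3])        # measured flow at the object's image location (deg/s)
self_motion_flow = np.array([-0.5, 0.3])  # flow predicted there from self-motion alone (deg/s)

# Flow parsing: remove the self-motion component to recover the component
# of optic flow due to object motion.
object_motion_flow = total_flow - self_motion_flow

# (A) is the vector sum of (B) and (C), as stated in the caption.
assert np.allclose(total_flow, self_motion_flow + object_motion_flow)
print(object_motion_flow)  # [-0.9  0. ]
```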
Figure 2
Screenshot and task. (A) Screenshot of virtual environment viewed through HMD. (B) Plan view of observer moving straight ahead and object moving from right to left toward an unmarked location (×) 3, 4, or 5 m from the home location. (C) Lateral shift manipulation applied in Session A-Shift and Session B-Shift trials. Observer's position in the virtual environment was shifted to the left by 20% of his or her forward displacement.
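A sketch of how such a manipulation could be implemented is shown below; the function and parameter names are hypothetical, not taken from the study's software. The virtual viewpoint tracks the walker's physical position, but its lateral coordinate is offset leftward by 20% of the forward displacement from the home location.

```python
SHIFT_GAIN = 0.20  # 20% lateral shift; set to 0.0 for no-shift trials

def virtual_position(physical_x, physical_z, home_z=0.0, shift_gain=SHIFT_GAIN):
    """Map the walker's physical position to the rendered virtual position.

    Hypothetical helper for illustration: leftward is taken to be the
    negative x direction, and forward displacement is measured along z
    from the home location.
    """
    forward_displacement = physical_z - home_z
    virtual_x = physical_x - shift_gain * forward_displacement
    return virtual_x, physical_z

# After walking 3 m straight ahead, the rendered viewpoint is 0.6 m to the left.
print(virtual_position(0.0, 3.0))  # (-0.6, 3.0)
```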
Figure 3
Schematic diagram of design of experiment. The four main quadrants represent trials with 0% lateral shift (red) and trials with 20% lateral shift (blue) in Sessions A and B. Session A comprised 120 trials with 0% lateral shift (solid red) and 24 randomly interspersed catch trials with 20% lateral shift (checkered blue). Session B comprised 120 trials with 20% lateral shift (solid blue) and 24 randomly interspersed catch trials with 0% lateral shift (checkered red).
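For concreteness, the trial structure could be generated as in the sketch below (Session A shown; Session B swaps the main and catch shift values). The randomization scheme and function names are assumptions, not the study's actual procedure.

```python
import random

def build_session(main_shift, catch_shift, n_main=120, n_catch=24, seed=None):
    """Return a shuffled list of lateral-shift gains for one session (illustrative)."""
    trials = [main_shift] * n_main + [catch_shift] * n_catch
    random.Random(seed).shuffle(trials)  # catch trials randomly interspersed
    return trials

session_a = build_session(main_shift=0.0, catch_shift=0.2, seed=1)  # mostly 0% shift
session_b = build_session(main_shift=0.2, catch_shift=0.0, seed=2)  # mostly 20% shift
print(len(session_a), session_a.count(0.2))  # 144 24
```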
Figure 4
Summary of results. (A) and (C) show the subset of conditions used for analyses shown in (B) and (D), respectively. Error bars represent ±1 SE and asterisks denote statistically significant differences.
Figure 5
Adaptation of perceived direction of self-motion based on nonvisual information in Session B. Over repeated Session B-Shift trials with the optic flow field shifted leftward, perceived direction based on nonvisual information (NV) shifted leftward toward the optically specified direction of self-motion (V). The effects of adaptation carried over to Session B-No Shift trials, in which the lateral shift was not applied.
Figure 6
Flow parsing with and without lateral shift. Optic flow fields with moving object for the no lateral shift (A, C) and 20% leftward shift (E) conditions. (B, D, F) show the parsing of the local optical motion of the moving object (solid line) into self-motion (dashed lines) and object-motion (dotted lines) components. V and NV indicate the perceived direction of self-motion based on visual and nonvisual information, respectively.

References

    1. Bruggeman H., Warren W. H. (2010). The direction of walking—but not throwing or kicking—is adapted by optic flow. Psychological Science, 21(7), 1006–1013.
    2. Bruggeman H., Zosh W., Warren W. H. (2007). Optic flow drives human visuo-locomotor adaptation. Current Biology, 17(23), 2035–2040.
    3. Calabro F. J., Soto-Faraco S., Vaina L. M. (2011). Acoustic facilitation of object movement detection during self-motion. Proceedings of the Royal Society B: Biological Sciences, 278(1719), 2840–2847.
    4. Campos J. L., Byrne P., Sun H. J. (2010). The brain weights body-based cues higher than vision when estimating walked distances. European Journal of Neuroscience, 31(10), 1889–1898.
    5. Chardenon A., Montagne G., Laurent M., Bootsma R. J. (2005). A robust solution for dealing with environmental changes in intercepting moving balls. Journal of Motor Behavior, 37(1), 52–64.
