Unifying account of visual motion and position perception

Oh-Sang Kwon et al. Proc Natl Acad Sci U S A. 2015 Jun 30;112(26):8142-7. doi: 10.1073/pnas.1500361112. Epub 2015 Jun 15.

Abstract

Despite growing evidence for perceptual interactions between motion and position, no unifying framework exists to account for these two key features of our visual experience. We show that percepts of both object position and motion derive from a common object-tracking system: a system that optimally integrates sensory signals with a realistic model of motion dynamics, effectively inferring their generative causes. The object-tracking model provides an excellent fit to both position and motion judgments in simple stimuli. With no changes in model parameters, the same model also accounts for subjects' novel illusory percepts in more complex moving stimuli. The resulting framework is characterized by a strong bidirectional coupling between position and motion estimates and provides a rational, unifying account of a number of motion and position phenomena that are currently thought to arise from independent mechanisms. These include motion-induced shifts in perceived position, perceptual slow-speed biases, slowing of motion shown in the visual periphery, and the well-known curveball illusion. These results reveal that motion perception cannot be isolated from position signals. Even in the simplest displays with no changes in object position, our perception is driven by the output of an object-tracking system that rationally infers different generative causes of motion signals. Taken together, we show that object tracking plays a fundamental role in the perception of visual motion and position.

Keywords: Kalman filter; causal inference; motion-induced position shift; object tracking; visual motion perception.
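
The keywords point to a Kalman-filter formulation of the tracking model sketched in Fig. 1B. As a rough illustration only (the state names, dynamics, and noise terms below are assumptions made for exposition, not the paper's actual specification), the three hidden states and two noisy measurements can be written as a linear-Gaussian state-space model:

\begin{aligned}
x_t     &= x_{t-1} + v^{o}_{t-1}\,\Delta t + \epsilon_x \\
v^{o}_t &= \alpha\, v^{o}_{t-1} + \epsilon_o \\
v^{p}_t &= \alpha\, v^{p}_{t-1} + \epsilon_p \\
m^{x}_t &= x_t + \eta_x \\
m^{v}_t &= v^{o}_t + v^{p}_t + \eta_v
\end{aligned}

Here x_t is object position, v^o_t object velocity, v^p_t pattern velocity, m^x_t a noisy position measurement, and m^v_t a noisy retinal (texture) velocity measurement that confounds object and pattern motion; the epsilon and eta terms are Gaussian process and measurement noise. A velocity decay alpha slightly below 1 acts as a slow-speed prior, and the relative process-noise variances of the two velocity states encode which source of motion is a priori more plausible. Under these linear-Gaussian assumptions, apportioning the measured motion between the two hidden causes is handled by a standard Kalman filter; the paper's actual model may involve a richer causal-inference step.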

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Fig. 1.
Schematic illustration of the object-tracking model and its behavior. (A) An example of an object with both object boundary motion and pattern motion. (B) A generative model of the Bayesian observer. White nodes indicate hidden variables and gray nodes indicate observable variables that are noisy measurements of the connected hidden variables. Arrows indicate causal links. (C) Model behavior for a typical motion-induced position shift (MIPS) stimulus containing a moving pattern within a static envelope. The steady-state estimates of the three object states (position, object velocity, and pattern velocity) are plotted for different positional uncertainties. At low positional uncertainty, most of the retinal texture motion is correctly attributed to the pattern motion. Consequently, illusory object motion and MIPS are negligible. At high positional uncertainty, much of the texture motion is attributed to object motion (reflecting a prior that object motion is more likely than pattern motion). This results in relatively low estimated pattern velocity and a large MIPS.
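
To make the behavior described in Fig. 1C concrete, the following is a minimal Python sketch of a Kalman-filter tracker in this spirit. Every number in it (the velocity decay, the process- and measurement-noise covariances, the pattern speed) is an illustrative assumption rather than a fitted value from the paper; the only claim is qualitative: when the position measurement is unreliable, part of the texture motion is reattributed from pattern motion to object motion and the position estimate drifts, i.e., a MIPS appears.

import numpy as np

def run_tracker(sigma_pos, v_pattern=1.0, dt=0.01, n_steps=500):
    """Toy tracker with hidden states [position, object velocity, pattern velocity].
    Illustrative parameters only; not the model fit reported in the paper."""
    A = np.array([[1.0, dt,   0.0],
                  [0.0, 0.99, 0.0],   # velocity decay ~ slow-speed prior (assumed)
                  [0.0, 0.0,  0.99]])
    Q = np.diag([1e-6, 1e-3, 1e-4])   # object motion assumed a priori more variable than pattern motion
    H = np.array([[1.0, 0.0, 0.0],    # measured envelope position
                  [0.0, 1.0, 1.0]])   # measured retinal texture velocity = object + pattern velocity
    R = np.diag([sigma_pos**2, 0.05**2])
    x_hat, P = np.zeros(3), np.eye(3)
    z = np.array([0.0, v_pattern])    # static envelope, internal pattern drifting at v_pattern
    for _ in range(n_steps):
        x_hat = A @ x_hat             # predict
        P = A @ P @ A.T + Q
        S = H @ P @ H.T + R           # update with the (constant) measurement
        K = P @ H.T @ np.linalg.inv(S)
        x_hat = x_hat + K @ (z - H @ x_hat)
        P = (np.eye(3) - K @ H) @ P
    return x_hat                      # approximate steady state: [position shift, v_object, v_pattern]

for sigma_pos in (0.05, 0.5, 5.0):    # low, medium, high positional uncertainty
    shift, v_obj, v_pat = run_tracker(sigma_pos)
    print(f"sigma_pos={sigma_pos}:  shift={shift:+.3f}  v_obj={v_obj:.3f}  v_pat={v_pat:.3f}")

With a reliable position measurement the texture motion is assigned almost entirely to pattern velocity and the shift stays near zero; as sigma_pos grows, object velocity absorbs part of the measured motion and the position estimate drifts in the pattern direction, matching the qualitative account in the caption.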
Fig. 2.
Experiment 1 stimuli and results. (A) Two types of spatial envelopes. (B) In the position task, subjects judged whether the right stimulus was above or below the left stimulus. For the two stimuli to be perceived at the same vertical locations, the physical location of the downward-moving stimulus has to be higher than that of the upward-moving stimulus (as depicted). (C) In the speed task, subjects judged whether the test stimulus pattern motion was faster or slower than the reference speed. For the two pattern speeds to be perceived as equal, the physical speed of the stimulus in the far periphery has to be faster than that of the reference stimulus (as depicted). (D) The position shift increases with increasing eccentricity [F(2,8) = 147.3, P < 10^-15] or when the envelope boundary is blurred [F(1,4) = 68.3, P = 0.001]. (E) In contrast, the perceived speed decreases with increasing eccentricity [F(2,8) = 17.1, P = 0.001] or when the envelope boundary is blurred [F(1,4) = 10.4, P = 0.032]. (F) There is a strong negative relationship between MIPS magnitude and changes in the perceived speed. The best-fitting model (blue symbols and lines in D, E, and F) closely fits the data (see Fig. S2 for model fits to the entire dataset).
Fig. 3.
Experiment 2 stimuli and results. (A) The left panel is a schematic illustration of the curveball illusion stimulus. The thin red line represents pattern motion, and the bold black line represents object motion. The right panel shows the temporal dynamics of the model's estimates of object position and velocity (bold black line). (B) Subjects' perceived horizontal object velocity as indicated by two different estimates: by object motion direction (dashed line) and by the change in object position over time (solid line). The results reveal a large conflict between the perceived horizontal velocity and the change in perceived position. (C) Predictions of the object-tracking model closely match subjects' estimates of perceived horizontal velocity. The model predictions are derived from subjects' data in experiment 1 (i.e., the model has no free parameters).
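
As a hedged aside, the horizontal axis of the curveball stimulus in A has the same structure as the MIPS stimulus of Fig. 1C: the envelope carries no horizontal motion, but the internal pattern does. Reusing the illustrative run_tracker sketch from the note under Fig. 1 (toy parameters, not the paper's fitted model), a large positional uncertainty, as in the far periphery, yields a nonzero horizontal object-velocity estimate and a horizontal drift of the estimated position, which is the direction in which the perceived trajectory curves:

# Horizontal axis of the curveball stimulus: envelope horizontally static, pattern drifting sideways.
# Requires run_tracker from the sketch under Fig. 1 (illustrative parameters only).
shift, v_obj, v_pat = run_tracker(sigma_pos=5.0, v_pattern=1.0)
print(f"horizontal drift={shift:+.3f}  v_obj={v_obj:.3f}  v_pat={v_pat:.3f}")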
Fig. 4.
Experiment 3 stimuli and results. (A) Schematic illustration of the illusory rotation stimulus and the models' estimates of object motion. The upper image is the actual stimulus: a static envelope with a translating pattern motion whose direction changes (i.e., rotates) at 1 Hz (Movies S4 and S5). Red arrows show changes in pattern motion direction. The vector sum model (middle image) predicts that the estimated object motion direction (black arrows) should be identical to the actual pattern motion direction (red arrows). The tracking model (bottom image) predicts that the estimated object motion direction should lead the physical pattern motion direction. In both models, the object trajectory converges to a circular path. (B) Psychophysical data (Left) and object-tracking model predictions (Right) for the radius of the illusory object motion. The perceived motion radius increases with increasing eccentricity [F(2,8) = 26.8, P = 0.0003] or when the envelope boundary is blurred [F(1,4) = 16.1, P = 0.016]. The same pattern is exhibited by the object-tracking model (as fitted in experiment 1). (C) Psychophysical data (Left) and object-tracking model predictions (Right) for the phase difference between the perceived object motion and the actual pattern motion direction. The phase difference decreases as the eccentricity increases [F(1,4) = 18.1, P = 0.013], with no significant effect of boundary type [F(1,4) = 0.77, P = 0.43]. The object-tracking model closely matches the human data. We could not measure the phase difference for the hard-boundary condition at 5° eccentricity because observers did not perceive illusory object motion rotation [the model predicts a very small rotation radius (0.07°), which is likely below the perceptual threshold].
Fig. 5.
A novel illusion predicted by the tracking model. (A) We added circular object motion (black arrows) to a stimulus whose pattern motion (red arrows) rotates in the opposite direction (orange arrows). (B) The tracking model predicts that the percept follows the object motion direction at near eccentricities and the pattern motion direction at far eccentricities; i.e., the direction of perceived object rotation changes with eccentricity. Notably, at an intermediate eccentricity, the object is predicted to oscillate along a nearly linear trajectory. This prediction can be confirmed by viewing Movie S6.

