Depth Perception Based on the Interaction of Binocular Disparity and Motion Parallax Cues in Three-Dimensional Space
- PMID: 40431963
- PMCID: PMC12115827
- DOI: 10.3390/s25103171
Abstract
Background and objectives: Depth perception by the human visual system in three-dimensional (3D) space plays an important role in human-computer interaction and artificial intelligence (AI). It relies mainly on binocular disparity and motion parallax cues. This study aims to systematically summarize research on depth perception specified by these two cues.
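As a point of reference (a standard first-order approximation from the stereoscopic-vision literature, not an equation taken from this review), the two cues carry geometrically analogous depth signals. For a small depth interval \(\Delta z\) at viewing distance \(D\), the relative binocular disparity \(\delta\) scales with the interocular separation \(I\), while the relative retinal motion \(\omega\) produced by motion parallax scales with the lateral head-translation speed \(T\):

\[
\delta \approx \frac{I \, \Delta z}{D^{2}}, \qquad \omega \approx \frac{T \, \Delta z}{D^{2}}
\]

Both signals fall off with the square of viewing distance, which is one reason viewing distance recurs below as a moderating factor for both cues.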
Materials and methods: We conducted a literature survey of related studies and summarized them in terms of motivations, research trends, mechanisms, and interaction models of depth perception specified by these two cues.
Results: Development trends show that depth perception research has gradually evolved from early studies based on a single cue to quantitative studies of the interaction between the two cues. The mechanisms of the two cues reveal that depth perception specified by the binocular disparity cue is mainly influenced by factors such as spatial variation in disparity, viewing distance, the position of the visual field (or retinal image) used, and interaction with other cues, whereas depth perception specified by the motion parallax cue is affected by head movement and retinal image motion, interaction with other cues, and the observer's age. By integrating the two cues, several types of depth perception models are summarized: the weak fusion (WF) model, the modified weak fusion (MWF) model, the strong fusion (SF) model, and the intrinsic constraint (IC) model. The merits and limitations of each model are analyzed and compared.
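To make the contrast among these models concrete, the sketch below illustrates the core computation shared by the WF and MWF accounts: a reliability-weighted linear combination of per-cue depth estimates, with each weight proportional to that cue's inverse variance. This is a minimal illustration assuming independent, unbiased cues; the function name, numbers, and interface are ours for illustration, not drawn from any specific reviewed model.

import numpy as np

def weak_fusion(depth_estimates, variances):
    # Reliability-weighted combination of independent depth-cue estimates.
    # Each cue's weight is its inverse variance, normalized to sum to one.
    estimates = np.asarray(depth_estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    fused_depth = float(np.dot(weights, estimates))
    fused_variance = 1.0 / reliabilities.sum()  # assumes cue independence
    return fused_depth, fused_variance

# Hypothetical example: a reliable disparity estimate (1.9 m) and a noisier
# motion parallax estimate (2.3 m) of the same surface.
depth, variance = weak_fusion([1.9, 2.3], [0.01, 0.04])
print(f"fused depth = {depth:.2f} m, fused variance = {variance:.4f}")

In this example the fused estimate (about 1.98 m) sits closer to the more reliable disparity cue. The MWF model extends this linear scheme by first "promoting" cues to a common depth format and letting the weights vary with viewing conditions, while the SF and IC models depart from simple linear weighting altogether.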
Conclusions: This review provides a clear picture of research on depth perception specified by binocular disparity and motion parallax cues. Open research challenges and future directions are presented. In the future, it will be necessary to explore methods for easier manipulation of depth-cue signals in stereoscopic images and to adopt deep-learning-based methods for model construction and depth prediction, to meet the increasing demands of human-computer interaction in complex 3D scenarios.
Keywords: 3D space; binocular disparity; depth perception; fusion models; human vision; human–computer interaction; motion parallax; virtual reality.
Conflict of interest statement
The authors declare no conflicts of interest.