Efficient inverse graphics in biological face processing
- PMID: 32181338
- PMCID: PMC7056304
- DOI: 10.1126/sciadv.aax5979
Abstract
Vision not only detects and recognizes objects but also performs rich inferences about the underlying scene structure that causes the patterns of light we see. Inverting generative models, or "analysis-by-synthesis", presents a possible solution, but its mechanistic implementations have typically been too slow for online perception, and their mapping to neural circuits remains unclear. Here we present a neurally plausible efficient inverse graphics model and test it in the domain of face recognition. The model is based on a deep neural network that learns to invert a three-dimensional face graphics program in a single fast feedforward pass. It explains human behavior qualitatively and quantitatively, including the classic "hollow face" illusion, and it maps directly onto a specialized face-processing circuit in the primate brain. The model fits both behavioral and neural data better than state-of-the-art computer vision models, and suggests an interpretable reverse-engineering account of how the brain transforms images into percepts.
Copyright © 2020 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works. Distributed under a Creative Commons Attribution NonCommercial License 4.0 (CC BY-NC).
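The abstract describes an architecture in which a generative 3D face graphics program is inverted by a feedforward recognition network, so that inference about scene parameters happens in a single forward pass. Below is a minimal, hypothetical sketch of that general idea: a toy nonlinear renderer stands in for the 3D face graphics program, and a small network is trained on (image, latents) pairs sampled from it. All names, dimensions, and the toy renderer are illustrative assumptions, not the authors' actual model or training procedure.

```python
# Minimal sketch of the "efficient inverse graphics" idea: a generative graphics
# program maps latent scene parameters (a toy stand-in for face shape/texture/pose)
# to an image, and a feedforward recognition network is trained on samples from
# that program so inference becomes a single fast forward pass.
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8    # toy scene parameters (illustrative)
IMAGE_DIM = 64    # toy "image" flattened to a vector
H = 128           # hidden units in the recognition network

# Fixed random "graphics program": a nonlinear map from latents to an image.
W_render = rng.normal(size=(IMAGE_DIM, LATENT_DIM))

def render(z):
    """Toy generative graphics program: latents -> image."""
    return np.tanh(W_render @ z)

# One-hidden-layer feedforward recognition network that learns to invert render().
W1 = rng.normal(scale=0.1, size=(H, IMAGE_DIM)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(LATENT_DIM, H)); b2 = np.zeros(LATENT_DIM)

def infer(x):
    """Single feedforward pass: image -> estimated latents."""
    h = np.maximum(0.0, W1 @ x + b1)   # ReLU hidden layer
    return W2 @ h + b2, h

lr = 1e-3
for step in range(5000):
    # Self-supervised training data: sample latents, render them, learn to invert.
    z = rng.normal(size=LATENT_DIM)
    x = render(z)
    z_hat, h = infer(x)
    err = z_hat - z                    # gradient of squared error w.r.t. output
    # Backpropagate through the two layers (plain SGD).
    gW2 = np.outer(err, h); gb2 = err
    dh = (W2.T @ err) * (h > 0)
    gW1 = np.outer(dh, x); gb1 = dh
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# After training, latent recovery requires only one forward pass.
z_test = rng.normal(size=LATENT_DIM)
z_rec, _ = infer(render(z_test))
print("latent recovery error:", np.mean((z_rec - z_test) ** 2))
```

The design choice this illustrates is that the expensive search of classical analysis-by-synthesis is amortized into the weights of a recognition network at training time, so perception at test time is feedforward and fast.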