Deciphering a neural code for vision

C Passaglia et al. Proc Natl Acad Sci U S A. 1997 Nov 11;94(23):12649-54.
doi: 10.1073/pnas.94.23.12649

Abstract

Deciphering the information that eyes, ears, and other sensory organs transmit to the brain is important for understanding the neural basis of behavior. Recordings from single sensory nerve cells have yielded useful insights, but single neurons generally do not mediate behavior; networks of neurons do. Monitoring the activity of all cells in a neural network of a behaving animal, however, is not yet possible. Taking an alternative approach, we used a realistic cell-based model to compute the ensemble of neural activity generated by one sensory organ, the lateral eye of the horseshoe crab, Limulus polyphemus. We studied how the neural network of this eye encodes natural scenes by presenting to the model movies recorded with a video camera mounted above the eye of an animal that was exploring its underwater habitat. Model predictions were confirmed by simultaneously recording responses from single optic nerve fibers of the same animal. We report here that the eye transmits to the brain robust "neural images" of objects having the size, contrast, and motion of potential mates. The neural code for such objects is not found in ambiguous messages of individual optic nerve fibers but rather in patterns of coherent activity that extend over small ensembles of nerve fibers and are bound together by stimulus motion. Integrative properties of neurons in the first synaptic layer of the brain appear well suited to detecting the patterns of coherent activity. Neural coding by this relatively simple eye helps explain how horseshoe crabs find mates and may lead to a better understanding of how more complex sensory organs process information.
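The approach outlined above (optically sampling each frame of the CrabCam movie with an array of model ommatidia, then converting the result into optic nerve firing rates) can be pictured in a few lines of Python. The 12 × 16 array size is taken from Fig. 2; the logarithmic light-to-rate conversion, the steady-state lateral-inhibition step, and all rate constants below are illustrative assumptions, not the paper's actual cell-based model.

    import numpy as np

    def optical_sample(frame, rows=12, cols=16):
        """Average pixel intensities over the field of view of each ommatidium;
        simple block averaging stands in for the eye's real optics."""
        h, w = frame.shape
        bh, bw = h // rows, w // cols
        return frame[:rows * bh, :cols * bw].reshape(rows, bh, cols, bw).mean(axis=(1, 3))

    def neural_image(sampled, k_inhib=0.1, r_base=15.0, gain=10.0, n_iter=20):
        """Toy optic nerve firing rates: a logarithmic excitatory drive minus a
        weighted sum of each ommatidium's four neighbors (a crude steady-state
        lateral-inhibition stage). All constants are illustrative."""
        excitation = r_base + gain * np.log1p(sampled / sampled.mean())
        rates = excitation.copy()
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
        for _ in range(n_iter):                      # relax toward the fixed point
            padded = np.pad(rates, 1)
            neighbor_sum = sum(padded[1 + dy:1 + dy + rates.shape[0],
                                      1 + dx:1 + dx + rates.shape[1]]
                               for dy, dx in offsets)
            rates = np.clip(excitation - k_inhib * neighbor_sum, 0.0, None)
        return rates                                 # firing rate (ips) per optic nerve fiber

    # One synthetic frame containing a dark, roughly crab-size object.
    frame = np.full((120, 160), 200.0)
    frame[40:80, 60:100] = 50.0
    print(neural_image(optical_sample(frame)).round(1))

In this toy version the inhibition term modestly enhances the object's borders in the computed neural image; the paper's model additionally captures the eye's temporal dynamics, which the figures below rely on.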

Figures

Figure 1
(a) A horseshoe crab, Limulus polyphemus, at the water's edge mounted with a video camera, CrabCam, for recording underwater movies and a microsuction electrode for recording responses from a single optic nerve fiber. The barrel of the electrode protrudes to the right from the recording chamber, which is sealed with a white cap (2.5 cm diameter). Cables lead the video and optic nerve signals to recording electronics located on an overhead skiff as the animal moves about underwater at depths of 0.5 to 1 m. (b) CrabCam images of a high-contrast black object (Left) and a low-contrast gray object (Right). Black disks indicate the fields of view of the ommatidia whose optic nerve responses were recorded with the microsuction electrode. (c) Light intensities (given as contrast) incident on the recorded ommatidia plotted as a function of time as the animal moved past the black (Left) and gray (Right) objects. Contrast is the ratio of the intensity of a pixel to the average intensity of the scene. (d) Recordings with the microsuction electrode of spike trains from single fibers in response to the 10-s sequences of light intensities shown in c. (e) Recorded trains of spikes in d plotted as instantaneous frequencies, which are the reciprocals of the intervals between nerve impulses. (f) Instantaneous frequencies computed by the cell-based model in response to digitized video images of the black and gray objects. The objects passed through the fields of view of the recorded and model neurons from 4 to 6 s after the start of the runs, evoking reduced (black object) and quasi-periodic (gray object) responses. Responses are representative of those recorded under similar conditions in other field experiments (n = 53 trials).
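Two of the quantities defined in this caption translate directly into code: contrast (panel c), the ratio of a pixel's intensity to the average intensity of the scene, and instantaneous frequency (panel e), the reciprocal of each interval between nerve impulses. A minimal sketch, with function names of our own choosing:

    import numpy as np

    def contrast(pixel_intensity, scene):
        """Contrast as defined in the caption: pixel intensity divided by the
        average intensity of the scene."""
        return pixel_intensity / np.mean(scene)

    def instantaneous_frequency(spike_times_s):
        """Instantaneous frequency: the reciprocal of each interval between
        successive nerve impulses, reported at the time of the later spike."""
        spikes = np.asarray(spike_times_s, dtype=float)
        intervals = np.diff(spikes)
        return spikes[1:], 1.0 / intervals

    # Example: a spike train that slows down (longer intervals, lower frequency).
    times, freqs = instantaneous_frequency([0.00, 0.05, 0.10, 0.20, 0.35])
    print(freqs)   # approximately [20, 20, 10, 6.7] impulses per second (ips)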
Figure 2
(a) CrabCam images of the black and gray objects after optical sampling. The arrays of pixels show the light intensities incident on the 12 × 16 array of ommatidia viewing the videotaped scene. Rows 1 and 2 show images of the black object while the animal was moving and stationary, respectively. Rows 3 and 4 show corresponding images of the gray object. Interval between displayed sequences is 0.25 s. (b) Computed neural images of the optically transformed visual images in a. The arrays of pixels give the computed firing rates of optic nerve fibers mapped onto a gray scale with black set to 0 ips and white set to twice the mean firing rate. Mean firing rates to the uniform background illumination in rows 1, 2, 3, and 4 were 12, 20, 13, and 18 ips, respectively. (c) Computed neural images of activity in the brain generated by the eye’s output in b. The arrays of pixels give the simulated responses of brain neurons having integration times of 400 ms mapped onto a gray scale with black set to 0 and white set to twice the mean. Each image sums eight sequential neural images from the eye. Computed images are based on the temporal properties of brain neurons and not their spatial interactions. (d) Simulated responses of brain neurons evoked by the activities of an equal number of ommatidia located in a horizontal row of the model eye (dashed lines in c). Movies of the underwater video recordings and their computed neural images can be viewed at the Center for Vision Research, http://www.hscsyr.edu/~eye.
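Two steps in this caption are simple enough to state as code: mapping firing rates onto the gray scale used in panel b (black at 0 ips, white at twice the mean rate), and forming the brain's neural image in panel c by summing eight sequential neural images from the eye. The sketch below follows those two statements literally; any scaling or normalization the authors applied beyond them is not reproduced here.

    import numpy as np

    def to_grayscale(rates):
        """Gray-scale mapping from the caption: black = 0 ips, white = twice the
        mean firing rate (values outside that range are clipped)."""
        return np.clip(rates / (2.0 * rates.mean()), 0.0, 1.0)

    def brain_neural_image(eye_images):
        """Simulated brain response with a 400-ms integration time, taken here
        simply as the sum of eight sequential neural images from the eye, as
        stated in the caption."""
        return np.sum(eye_images[-8:], axis=0)

    # Example: eight successive 12 x 16 arrays of optic nerve firing rates.
    rng = np.random.default_rng(0)
    eye_images = rng.poisson(15, size=(8, 12, 16)).astype(float)
    print(to_grayscale(brain_neural_image(eye_images)).shape)   # (12, 16)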
Figure 3
Temporal integration of optic-nerve responses by neurons in the first synaptic layer of the brain. The gray and black symbols plot the signal-to-noise ratio (SNR) of simulated responses of brain neurons to moving images of the gray and black objects in Fig. 2, respectively, as a function of synaptic integration time. SNR is defined as the peak-to-peak response modulation of neurons viewing the objects relative to twice the standard deviation of response modulations of the rest of the network.
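The SNR definition given here maps onto a short calculation. One reasonable reading of it, assuming the simulated responses are available as arrays of firing rates over time for the object-viewing neurons and for the rest of the network:

    import numpy as np

    def snr(object_responses, network_responses):
        """SNR per the caption, under one reading of its definition:
        peak-to-peak modulation of the (mean) response of neurons viewing the
        object, divided by twice the standard deviation of the per-neuron
        response modulations across the rest of the network.

        object_responses  : array (n_object_neurons, n_time) of firing rates
        network_responses : array (n_other_neurons, n_time) of firing rates"""
        signal = np.ptp(object_responses.mean(axis=0))
        modulations = np.ptp(network_responses, axis=1)
        return signal / (2.0 * np.std(modulations))

    # Example with synthetic responses: one modulated object neuron, 50 noisy others.
    t = np.linspace(0.0, 10.0, 200)
    obj = 15.0 + 10.0 * np.sin(2.0 * np.pi * 0.5 * t)[None, :]
    rest = 15.0 + np.random.default_rng(1).normal(0.0, 2.0, size=(50, 200))
    print(round(snr(obj, rest), 2))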
Figure 4
Spatiotemporal response properties of the lateral eye. (a) Spatial transfer function. Gain of the response of a single optic nerve fiber plotted as a function of the spatial frequency of a drifting sinusoidal grating. Negative slope lines denote the frequency band corresponding to crab-size objects over the range of distances at which Limulus detect potential mates in their natural habitat (0.25–1.4 m). (b) Temporal transfer function. Response gain plotted as a function of the frequency of a flickering spot that fills the field of view of the recorded ommatidium. Positive slope lines denote the frequency band corresponding to animal velocities over the range of distances of mate detection (0.5–6 Hz). Negative slope lines give the frequency range of strobic lighting (2–6 Hz) in the animal’s natural underwater environment.
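Transfer functions like these are typically estimated by measuring the modulation of the firing rate at the stimulus frequency and dividing by the stimulus contrast amplitude. The sketch below shows that generic procedure; it is not necessarily the exact analysis behind Fig. 4, and the parameter values are made up for the example.

    import numpy as np

    def response_gain(response, stim_contrast_amplitude, stim_freq_hz, fs_hz):
        """Generic gain estimate at one stimulus frequency: the amplitude of the
        response's Fourier component at stim_freq_hz, divided by the contrast
        amplitude of the drifting grating or flickering spot."""
        response = np.asarray(response, dtype=float)
        n = response.size
        freqs = np.fft.rfftfreq(n, d=1.0 / fs_hz)
        amplitudes = np.abs(np.fft.rfft(response - response.mean())) * 2.0 / n
        k = np.argmin(np.abs(freqs - stim_freq_hz))   # bin nearest the stimulus frequency
        return amplitudes[k] / stim_contrast_amplitude

    # Example: a response to a 2-Hz flickering spot, sampled at 100 Hz for 10 s.
    fs, f0 = 100.0, 2.0
    t = np.arange(0.0, 10.0, 1.0 / fs)
    resp = 15.0 + 6.0 * np.sin(2.0 * np.pi * f0 * t)        # firing rate in ips
    print(response_gain(resp, stim_contrast_amplitude=0.3, stim_freq_hz=f0, fs_hz=fs))   # about 20 ips per unit contrast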
Figure 5
Comparison of the visual performance of horseshoe crabs to that predicted from the computational model. Black symbols replot from a behavioral study (13) the probability of crabs turning toward a black crab-size object as a function of the distance from the object. Gray symbols plot the visual performance of an ideal horseshoe crab predicted from optic nerve responses computed by the model. We assume that the ideal horseshoe crab moved past the object at an average speed of 15 cm/s and water turbidity decreased the object’s contrast with distance (see Materials and Methods). We use signal detection theory to estimate the probability of detecting the computed optic nerve responses in the presence of noise. A coefficient of variation of 0.1 in firing rate noise and a threshold for detection permitting a 1% false-alarm rate provided the best description of the measured visual performance. The higher coefficient of variation measured in the field (0.2) suggests that the decision maker in the crab brain pools the activities of a small cluster of optic nerve fibers (≈4), which is consistent with the number of ommatidia viewing the object at behavioral threshold (13, 14). Significantly different estimates of visual performance require at least a 10-fold change in detection threshold or a 2-fold change in noise level.
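The signal detection step described here can be sketched as follows, assuming (as stated in the caption) Gaussian firing-rate noise with a given coefficient of variation and a detection threshold set for a 1% false-alarm rate. The pooling of roughly four optic nerve fibers is modeled crudely as averaging, and the function and parameter names are ours, not the paper's.

    from statistics import NormalDist
    import math

    def detection_probability(rate_with_object, baseline_rate,
                              cv=0.1, false_alarm=0.01, n_fibers=1):
        """Probability that the optic nerve response to the object exceeds a
        threshold placed on the baseline-rate distribution at the stated
        false-alarm rate. Noise is Gaussian with coefficient of variation cv;
        pooling n_fibers is treated as averaging, which scales the noise down
        by sqrt(n_fibers)."""
        sigma_base = cv * baseline_rate / math.sqrt(n_fibers)
        sigma_obj = cv * rate_with_object / math.sqrt(n_fibers)
        threshold = NormalDist(baseline_rate, sigma_base).inv_cdf(1.0 - false_alarm)
        return 1.0 - NormalDist(rate_with_object, sigma_obj).cdf(threshold)

    # Example: a modest rate increase produced by a distant, low-contrast object.
    print(round(detection_probability(rate_with_object=17.0, baseline_rate=15.0), 2))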

References

    1. Dowling J E. The Retina: An Approachable Part of the Brain. Cambridge, MA: Harvard Univ. Press; 1987.
    2. Rieke F, Warland D, de Ruyter van Steveninck R R, Bialek W. Spikes: Exploring the Neural Code. Cambridge, MA: MIT Press; 1997.
    3. Bialek W, Owen W G. Biophys J. 1990;58:1227–1233.
    4. Lettvin J Y, Maturana H R, McCulloch W S, Pitts W H. Proc Inst Radio Eng. 1959;47:1940–1951.
    5. Levine J S, MacNichol E F. Sens Processes. 1979;3:95–131.
