EDITORIAL FOCUS

Frame-Up. Focus on "Eye-Centered, Head-Centered, and Complex Coding of Visual and Auditory Targets in the Intraparietal Sulcus"
Journal of Neurophysiology, Volume 94, Issue 4, October 2005, pp. 2259–2260. American Physiological Society.
DOI: 10.1152/jn.00466.2005
ISSN: 1522-1598
Lawrence H. Snyder

Published online 1 October 2005

For neuroscientists, the world would be simpler if the frame of reference for spatial information in the brain reflected either the sensory apparatus from which the signals were derived or the motor apparatus toward which the signals were aimed. In this simple world, the neural correlate of a visual stimulus would reflect where on the retina the stimulus had appeared, while the neural correlate of an auditory stimulus would reflect where the sound source was relative to the ears. An area involved in coding arm movements would reflect the goal location relative to the current arm position. Unfortunately, the world is not so simple. In this issue of the Journal of Neurophysiology (pp. 2331–2352), Mullette-Gillman and colleagues (2005), recording from neurons on both banks of the intraparietal sulcus (IPS), show that individual neurons represent the spatial locations of visual and auditory stimuli similarly and that many neurons use idiosyncratic frames of reference that are neither head-centered nor eye-centered. Similar findings were recently reported by Schlack and colleagues (2005) in the fundus of the IPS. These findings are part of a shift toward a new view of frames of reference in the brain.

Work from the Knudsen laboratory supports the simple story. In the optic tectum of the owl, spatial information derived from auditory signals becomes aligned with visual information (Knudsen and Brainard 1991), effectively transforming visual and auditory signals into a common eye-centered frame of reference (see also Pouget et al. 2002). In this view, misalignments or deviations from a "pure" eye-centered frame of reference might reflect either measurement error or neural noise: inconsequential peculiarities of individual neurons that cancel out at the population level.

Work in mammals suggests a more complex picture. Neurons in the monkey superior colliculus encode visual stimulus location referenced to the eye, whereas sound stimuli are represented in an assortment of reference frames that are neither eye- nor head-centered but instead appear to reflect a nonsystematic compromise between the two (Jay and Sparks 1987). In cortical area VIP, tactile stimuli are represented in a primarily head-centered frame, whereas visual stimuli are referenced to the eyes, to the head, or to some intermediate frame (Duhamel et al. 1997). What is the utility of a representation based on an intermediate frame of reference? An intermediate frame may reflect an intermediate stage in a reference frame transformation: auditory to eye-centered in the colliculus, or visual to head-centered in VIP. A similar explanation has been applied to the modulation of visual responses by eye position in area LIP (Andersen and Zipser 1988).

The findings of Mullette-Gillman and colleagues (2005) and Schlack and colleagues (2005) point toward another interpretation. They show that between one-third and three-quarters of neurons on both banks and in the fundus of the IPS use intermediate frames of reference for visual and auditory stimuli.
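To make "intermediate frame" concrete: if a neuron's spatial tuning curve, plotted in head coordinates, shifts by a fraction w of every change in eye position, then w = 1 corresponds to an eye-centered frame, w = 0 to a head-centered frame, and values in between to the hybrid frames at issue here. The sketch below is a simplified illustration of how such a weight can be estimated, not the authors' analysis code; the simulated neuron and all names in it are hypothetical.

```python
# Minimal sketch: estimate a model neuron's reference-frame weight w by
# cross-correlating tuning curves measured at two fixation positions.
import numpy as np

def response(target_deg, eye_deg, w, pref=0.0, width=15.0):
    """Firing rate to a target at head-coordinate location target_deg while
    fixating eye_deg. The tuning peak sits at pref + w * eye_deg, so it
    follows the eye by the fraction w (w = 1 eye-centered, w = 0 head-centered)."""
    return np.exp(-0.5 * ((target_deg - (pref + w * eye_deg)) / width) ** 2)

def estimate_frame_weight(targets, eye_a, eye_b, rates_a, rates_b):
    """Shift curve A until it best matches curve B; that shift, expressed as
    a fraction of the eye displacement, estimates w (~0 head-centered, ~1
    eye-centered, in between intermediate). Wrap-around from np.roll is
    ignored because the simulated tuning is negligible at the grid edges."""
    step = targets[1] - targets[0]
    shifts = np.arange(-40.0, 41.0, step)
    corrs = [np.corrcoef(rates_b, np.roll(rates_a, int(round(s / step))))[0, 1]
             for s in shifts]
    return shifts[int(np.argmax(corrs))] / (eye_b - eye_a)

targets = np.arange(-60.0, 61.0, 1.0)      # target locations, head coordinates (deg)
eye_a, eye_b = -10.0, 10.0                 # two fixation positions (deg)
rates_a = response(targets, eye_a, w=0.5)  # w = 0.5: a hypothetical intermediate frame
rates_b = response(targets, eye_b, w=0.5)
print(estimate_frame_weight(targets, eye_a, eye_b, rates_a, rates_b))  # ~0.5
```

On real data the same logic applies curve by curve, and the population question becomes whether the estimated weights pile up at 0 and 1 or spread across the interval between them.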
These remarkably large percentages are similar to those found by Groh and colleagues for auditory stimuli in the inferior colliculus and in auditory cortex (Groh et al. 2001; Werner-Reiss et al. 2003). The ubiquity of intermediate frames of reference, and the lack of a progressive shift toward one frame or another, suggest that these mixed frames do not reflect an intermediate stage in a reference frame transformation but rather an intentional coding scheme that is maintained across multiple brain areas and sensory modalities.

The brain must not only convert information from one frame to another but also integrate noisy information arriving from different sensory systems (sensor fusion). Deneve and Pouget (2004) hypothesize that intermediate frames of reference may be a neural correlate of an arbitration process designed to reconcile and integrate these noisy input signals. In their recurrent neural network model, inputs from different sensory modalities "pull" internal representations toward one frame or another, resulting in nodes that use a mixture of intermediate frames of reference.
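The essence of that pull can be captured without the full recurrent network. In the toy sketch below (an assumed simplification, not Deneve and Pouget's actual model; fixed reliability weights stand in for the recurrent dynamics), a unit sums an eye-centered visual input and a head-centered auditory input in proportion to their reliabilities. Its own tuning then shifts with eye position by an intermediate fraction, pulled toward the frame of the more reliable input.

```python
# Toy sketch: reliability-weighted summation of an eye-centered and a
# head-centered input yields a unit with an intermediate reference frame.
import numpy as np

def gauss(x, mu, sigma=15.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def fused_peak(eye, rel_vis, rel_aud):
    """Head-coordinate tuning peak of a unit driven by a visual input whose
    tuning moves with the eye (eye-centered) and an auditory input that does
    not (head-centered), each weighted by its reliability."""
    x = np.linspace(-60.0, 60.0, 2401)  # fine grid of head-coordinate locations
    drive = rel_vis * gauss(x, eye) + rel_aud * gauss(x, 0.0)
    return x[int(np.argmax(drive))]

for rel_vis, rel_aud in [(3.0, 1.0), (1.0, 1.0), (1.0, 3.0)]:
    # Effective frame weight: peak shift per degree of eye displacement.
    w = (fused_peak(10.0, rel_vis, rel_aud)
         - fused_peak(-10.0, rel_vis, rel_aud)) / 20.0
    print(f"vis:aud reliability {rel_vis:.0f}:{rel_aud:.0f} -> frame weight {w:.2f}")
```

Equal reliabilities leave the unit halfway between frames (w near 0.5); skewing the reliabilities drags w toward 1 (eye-centered) or 0 (head-centered), which is the sense in which noisy inputs "pull" the representation.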
The model of Deneve and Pouget is appealing but still does not explain why a mixture of intermediate frames of reference is maintained across many cortical and subcortical areas. Mullette-Gillman and colleagues suggest that mixed frames may occur because information from more than one frame of reference is of use to downstream structures (Klier et al. 2003). For example, directing gaze to a target requires not just eye-centered information but also head-centered information. Regardless of the explanation, these recent papers make it clear that we must understand intermediate reference frames in order to fully understand sensory-to-motor transformations in the brain.

REFERENCES

Andersen RA and Zipser D. The role of the posterior parietal cortex in coordinate transformations for visual-motor integration. Can J Physiol Pharmacol 66: 488–501, 1988.

Deneve S and Pouget A. Bayesian multisensory integration and cross-modal spatial links. J Physiol 98: 249–258, 2004.

Duhamel JR, Bremmer F, Ben Hamed S, and Graf W. Spatial invariance of visual receptive fields in parietal cortex neurons. Nature 389: 845–848, 1997.

Groh JM, Trause AS, Underhill AM, Clark KR, and Inati S. Eye position influences auditory responses in primate inferior colliculus. Neuron 29: 509–518, 2001.

Jay MF and Sparks DL. Sensorimotor integration in the primate superior colliculus. I. Motor convergence. J Neurophysiol 57: 22–34, 1987.

Klier EM, Wang H, and Crawford JD. Three-dimensional eye-head coordination is implemented downstream from the superior colliculus. J Neurophysiol 89: 2839–2853, 2003.

Knudsen EI and Brainard MS. Visual instruction of the neural map of auditory space in the developing optic tectum. Science 253: 85–87, 1991.

Mullette-Gillman OA, Cohen YE, and Groh JM. Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus. J Neurophysiol 94: 2331–2352, 2005.

Pouget A, Ducom JC, Torri J, and Bavelier D. Multisensory spatial representations in eye-centered coordinates for reaching. Cognition 83: B1–B11, 2002.

Schlack A, Sterbing-D'Angelo SJ, Hartung K, Hoffman K-P, and Bremmer F. Multisensory space representations in the macaque ventral intraparietal area. J Neurosci 25: 4616–4625, 2005.

Werner-Reiss U, Kelly KA, Trause AS, Underhill AM, and Groh JM. Eye position affects activity in primary auditory cortex of primates. Curr Biol 13: 554–562, 2003.

AUTHOR NOTES

Address for correspondence: Anatomy and Neurobiology, Box 8108, Washington University School of Medicine, 660 S. Euclid Ave., St. Louis, MO 63110 (E-mail: [email protected]).