Trends in Neurosciences
Multisensory spatial interactions: a window onto functional integration in the human brain
Introduction
A common theme throughout much research in neuroscience and neuropsychology has been ‘functional specialization’: that distinct cognitive, sensory and motor functions can be localized in distinct areas of the brain [1]. Functional neuroimaging now allows confirmatory and exploratory mapping of many such functions in the human brain. A simple example is identification of sensory-specific areas, responding to stimulation in one but not another sensory modality. Sensory-specific cortices can readily be localized using functional magnetic resonance imaging (fMRI) in humans, to regions in occipital cortex for vision, to regions in and around the superior temporal gyrus for audition, and to regions in post-central cortex for touch [2] (Figure 1a,i).
Psychological studies have shown for many years that stimuli in different sensory modalities can powerfully interact under some circumstances, to determine perception or behaviour. Textbook examples include crossmodal illusions such as the ventriloquist effect (perceived location of a sound can shift towards its apparent visual source, as when watching a movie at the cinema) and the McGurk effect (whereby a seen lip-movement can change how a concurrent speech-sound is heard) [3]. Although such illusions can arise with anomalous or incongruent multisensory combinations, many further crossmodal effects indicate the impact of multisensory congruence. For instance, visual detection can be enhanced at the location of a sound [4,5], and there are many further examples of crossmodal links in spatial attention [6]. In the real world, signals in different modalities from a common external event or object will often be spatially and temporally aligned, and multisensory integration appears subject to corresponding spatial and temporal constraints [7].
The neural basis of multisensory interactions has been studied using both intra-cranial recordings (mainly in animals) and non-invasive electrophysiological or haemodynamic measures in humans. Pioneering animal studies focused on multisensory interactions that can arise owing to converging feedforward projections from sensory-specific structures to heteromodal areas. Neurons in the latter areas respond to stimuli in more than one modality, enabling multisensory interactions to occur at the single-cell level. Such multisensory neurons have now been discovered using single-cell recording in many cortical and sub-cortical regions. Cortical multisensory regions include numerous areas in parietal cortex (e.g. the ventral intraparietal area, VIP), temporal cortex (e.g. the caudal superior temporal polysensory region, cSTP) and frontal cortex (e.g. ventral premotor cortex, vPM) [8]. Subcortical regions include the superior colliculus (thoroughly investigated as a model case of multisensory interactions) and the basal ganglia, notably the putamen [9]. Such ‘functional specialization’ (here, localized neural selectivity for stimulation in multiple modalities or in just one modality) might provide one mechanism for ‘functional integration’ of information from different modalities (here, arising in multisensory areas, via feedforward convergence from unimodal areas). Indeed, this has been the standard assumption in research on multisensory integration for many years. But recent fMRI evidence from the human brain suggests that although feedforward convergence from unimodal to heteromodal regions is part of the story, it is by no means the whole story.
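In the single-cell superior colliculus work cited above, multisensory interaction is typically quantified by comparing the response to a bimodal pair against the most effective unimodal response. The index below follows that standard convention from the animal literature; it is not spelled out in this excerpt, and the spike counts are purely hypothetical:

```python
def enhancement_index(multisensory, best_unimodal):
    """Percent multisensory enhancement relative to the most
    effective unimodal response (the convention used in the
    superior colliculus single-cell literature)."""
    return 100.0 * (multisensory - best_unimodal) / best_unimodal

# Hypothetical mean spike counts for one collicular neuron:
visual_alone = 4.0
auditory_alone = 3.0
combined = 10.0

index = enhancement_index(combined, max(visual_alone, auditory_alone))
# the combined response exceeds the best unimodal response by 150%
```

A positive index indicates multisensory enhancement at the single-cell level; values can exceed 100% when the bimodal response is more than double the best unimodal response.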
Anatomical convergence as one mechanism for multisensory processing in humans
Perhaps the simplest approach for uncovering candidate multisensory areas in the human brain using neuroimaging is to measure brain activity during stimulation of particular modalities, and then determine whether any regions respond to stimulation of more than one modality. Using visual, auditory and tactile stimuli, Bremmer and colleagues [2] (Figure 1a,ii, white squares) identified multisensory responses in this way for the intraparietal sulcus (IPS), inferior parietal lobule (IPL) and vPM.
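The logic of this approach can be summarised as a conjunction across modalities: a region counts as a candidate multisensory area only if it responds during stimulation of more than one modality. A minimal sketch of that selection rule, using hypothetical thresholded activation maps (the region names echo the text; the boolean values are invented for illustration):

```python
# Hypothetical thresholded activations (region -> responds?) per modality.
visual   = {"V1": True,  "IPS": True, "A1": False, "S1": False, "vPM": True}
auditory = {"V1": False, "IPS": True, "A1": True,  "S1": False, "vPM": True}
tactile  = {"V1": False, "IPS": True, "A1": False, "S1": True,  "vPM": True}

# Candidate multisensory regions: active in more than one modality.
maps = [visual, auditory, tactile]
multisensory = sorted(
    region for region in visual
    if sum(m[region] for m in maps) > 1
)
# -> ['IPS', 'vPM'], mirroring the parietal and premotor candidates in the text
```

Note that this only establishes convergence of responses, not interaction: a region passing this test could still process each modality independently.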
Multisensory interactions between concurrent stimuli in different modalities
Work such as that outlined in the preceding sections can provide initial evidence for candidate multisensory areas in humans, as established in other species. But human neuroimaging studies of the type described so far stimulated only one modality at a time. Two further crucial steps are to examine how the brain responds when several modalities are stimulated in combination, and how this might depend on the spatial and temporal relationships between the stimuli [6,7,13].
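One common way to operationalise the first step, testing combined stimulation, is the superadditivity criterion inherited from single-cell work: the response to a multisensory pair should exceed the sum of the two unimodal responses. That criterion is standard in this literature but is not detailed in the excerpt, so treat the sketch and its signal values as illustrative:

```python
def superadditive(resp_combined, resp_a, resp_b, criterion=0.0):
    """True if the bimodal response exceeds the sum of the two
    unimodal responses (AV > A + V), the classic interaction test."""
    return resp_combined - (resp_a + resp_b) > criterion

# Hypothetical BOLD signal changes (%) in one region of interest:
assert superadditive(resp_combined=1.4, resp_a=0.5, resp_b=0.6)      # 1.4 > 1.1
assert not superadditive(resp_combined=1.0, resp_a=0.5, resp_b=0.6)  # 1.0 < 1.1
```

A mere response to both modalities (as in the conjunction approach) would fail this test whenever the combined response is simply the sum of its parts.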
Behavioural and
Beyond convergence to multimodal brain regions: multisensory interactions can affect ‘unimodal’ brain regions
In several recent fMRI studies (e.g. [15,16]), the relative location of concurrent visual and tactile stimuli has been varied. On a trial-by-trial basis, subjects saw a brief flash in the left or right hemifield near one hand. Unpredictably, on half the trials this visual stimulus was combined with a synchronous unseen vibration to one hand (left or right hand in separate experiments). Thus, visual and tactile stimuli could be presented in either multisensory spatial congruence or
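The trial structure described above amounts to a small factorial design: visual hemifield (left or right) crossed with tactile stimulation (absent, or present at a fixed hand), so that bimodal trials are spatially congruent when flash and touch share a side and incongruent otherwise. A sketch of such a trial list; the condition labels and counts are assumptions for illustration, not the studies' actual parameters:

```python
import itertools
import random

def build_trials(touched_hand="right", n_repeats=2, seed=0):
    """Cross visual hemifield with tactile presence; label bimodal
    trials congruent when flash and touch share a side."""
    trials = []
    for side, touch in itertools.product(["left", "right"], [False, True]):
        if not touch:
            label = "visual_only"
        elif side == touched_hand:
            label = "congruent"
        else:
            label = "incongruent"
        trials += [{"visual_side": side, "touch": touch, "label": label}] * n_repeats
    random.Random(seed).shuffle(trials)  # unpredictable trial order
    return trials

trials = build_trials()
# as in the studies described, half the trials pair the flash with a touch
assert sum(t["touch"] for t in trials) == len(trials) // 2
```

Contrasting congruent against incongruent bimodal trials then isolates the effect of spatial alignment while holding the stimuli themselves constant.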
Possible mechanisms for crossmodal effects on ‘unimodal’ sensory-specific cortices
The findings of crossmodal effects on sensory-specific cortices or sensory-specific ERP components (Figure 2 and Box 2) raise the question of how information concerning one modality could reach brain regions dedicated primarily to processing a different modality. Two contrasting proposals will now be considered, although these need not be mutually exclusive. One involves newly discovered direct connections between sensory-specific areas for different modalities; the other concerns ‘top-down’
Concluding remarks
Although signals in different sensory modalities are initially processed in different brain regions (functional segregation), these signals can also be combined and can interact crossmodally (providing examples of functional integration). Here, we have reviewed recent neuroimaging findings on spatial crossmodal interactions in the human brain. These indicate that feedforward convergence from lower-level, sensory-specific areas to higher-order, heteromodal areas might not be the only mechanism
Acknowledgements
Our thanks to Chris Frith and Martin Eimer for the past and ongoing collaboration on the work reported here. E.M. is supported by the Italian Ministry of Health and Telecom Italia Mobile. J.D. is supported by the Wellcome Trust, and was previously supported by the Medical Research Council (UK).
References

Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys. Neuron (2001)
Spatial attention and crossmodal interactions between vision and touch. Neuropsychologia (2001)
The cortical control of movement revisited. Neuron (2002)
Crossmodal spatial influences of touch on extrastriate visual areas take current gaze-direction into account. Neuron (2002)
Crossmodal links in endogenous and exogenous spatial attention: evidence from event-related brain potential studies. Neurosci. Biobehav. Rev. (2001)
Feeling with the mind's eye: contribution of visual cortex to tactile perception. Behav. Brain Res. (2002)
Multisensory auditory-somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Brain Res. Cogn. Brain Res. (2000)
Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res. Cogn. Brain Res. (2002)
An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. Brain Res. Cogn. Brain Res. (2002)
Delayed striate cortical activation during spatial attention. Neuron (2002)
The neural basis of biased competition in human visual cortex. Neuropsychologia
Modality-specific control of strategic spatial attention in parietal cortex. Neuron
Increased activity in human visual cortex during directed attention in the absence of visual stimulation. Neuron
Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr. Biol.
Spatial and temporal factors during processing of audiovisual speech: a PET study. NeuroImage
The New Cognitive Neurosciences
The psychology of multimodal perception
Involuntary orienting to sound improves visual perception. Nature
Enhancement of visual perception by crossmodal visuo–auditory interaction. Exp. Brain Res.
Crossmodal Space and Crossmodal Attention
The Handbook of Multisensory Processing
The representation of extrapersonal space: a possible role for bimodal, visuo–tactile neurons
The Merging of the Senses
A system of multimodal areas in the primate brain
That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science
Crossmodal spatial interactions in subcortical and cortical circuits