Beyond the window: multisensory representation of peripersonal space across a transparent barrier

https://doi.org/10.1016/S0167-8760(03)00124-7

Abstract

A large body of neuropsychological evidence has recently been provided showing that humans can code visual objects in nearby space through multisensory visuo–tactile integrative processes, which share several similarities with the functional properties of bimodal neurons documented in neurophysiological studies. In particular, the phenomenon of visuo–tactile extinction reveals that crossmodal integration may take place in a privileged manner within a limited sector of space closely surrounding the body surface, i.e. in the near peripersonal space. Here we report that visuo–tactile extinction can similarly be obtained when a physical, transparent barrier is interposed between the patients’ hand and a proximal visual stimulus. These findings show that a visuo–tactile representation of peripersonal space can be formed despite the subject's explicit awareness that the hand cannot physically be touched. This phenomenon indicates that multisensory integrative processing can occur in a bottom-up fashion, without necessarily being modulated by more ‘cognitive’ processes. Such integration may be functionally important for automatic reactions such as head turning or hand withdrawal.

Introduction

Imagine you are quietly observing the sunset, looking out of a window in the countryside. Suddenly, a bee flies towards your hand from outside. A few moments later, you might be quite disappointed and wonder why you flinched and withdrew your hand, since you knew perfectly well that the bee had no chance of stinging you: a solid barrier, the window pane, was there to protect your hand. In several species, including humans, avoidance reactions such as hand withdrawal or head turning are triggered by images of objects that grow in size, and they are present in infants as young as 1 week (Ball et al., 1983, Dunkeld and Bower, 1980). Escaping behaviour can be evoked in humans even when the object is projected on a screen and thus has no chance of touching the subject (King et al., 1992).

In evolutionary terms, visually based detection of nearby objects would be very useful for preparing appropriate motor reactions. Cues about direction of motion in depth and time to collision can be used to successfully avoid or achieve collision (Regan and Gray, 2000), and collision-sensitive neurons triggering avoidance movements have been reported in the pigeon and locust (Rind and Simmons, 1999). In higher vertebrates, detection of nearby objects can be derived by integrating multiple sources of sensory information (Spence and Driver, 2002). An ever-growing body of evidence supports the notion that visual space surrounding the body (peripersonal space) in the monkey is coded through multisensory integration at the single-neuron level (Duhamel et al., 1991, Duhamel et al., 1998, Graziano and Gross, 1995, Graziano and Gross, 1998, Rizzolatti et al., 1981, Rizzolatti et al., 1998). The putamen and some parietal and premotor areas contain multisensory neurons with tactile and visual receptive fields (RFs) whose locations are roughly matched in space. These neurons respond both to cutaneous stimuli and to visual stimuli presented close to the body part (e.g. the head or the hand) where the tactile RF is located, and their visual responses decrease at greater distances. In addition, bimodal neurons operate to some degree in body-part-centred co-ordinates, in that the visual RF remains spatially anchored to the tactile RF when the body part is moved. Owing to these functional properties, it has been suggested that premotor cortex, parietal areas and the putamen form an interconnected system for multisensory coding of near peripersonal space centred on body parts (Colby et al., 1993, Duhamel et al., 1998, Fogassi et al., 1996, Fogassi et al., 1999, Graziano et al., 1997).

By investigating patients with tactile extinction, we provided the first evidence that the human brain integrates visual–tactile information for representing peripersonal space. Extinction patients (Loeb, 1885, Oppenheim, 1885) usually fail to report a contralesional stimulus when a concurrent stimulus is presented ipsilesionally, typically within the same sensory modality. Beyond the presence of some sensory deficits (Eimer et al., 2002, Làdavas, 1990), the phenomenon of extinction results from an uneven competition between spared and affected spatial representations, which benefit from stronger and weaker competitive weights, respectively (di Pellegrino and De Renzi, 1995, di Pellegrino et al., 1997a, Driver et al., 1997, Duncan, 1980, Duncan, 1996, Ward et al., 1994). In the domain of visual–tactile extinction (Bender, 1952, Mattingley et al., 1997), we revealed the spatial specificity of crossmodal effects (di Pellegrino et al., 1997b, Làdavas et al., 1998a, Làdavas et al., 1998b) by showing that a visual stimulus presented near the patient's ipsilesional hand or face (i.e. in near peripersonal space) strongly inhibited the awareness of a tactile stimulus delivered to the contralesional hand or side of the face (crossmodal extinction). Crucially, less inhibition was produced by a visual stimulus presented far from the patient's hand or face (i.e. in far peripersonal space). As we have reasoned elsewhere (for an authoritative review see Làdavas, 2002), this pattern of results is consistent with the existence, in humans, of an integrated visual–tactile system coding near peripersonal space, similar to that described in monkeys. Owing to this sensory integration, a visual stimulus can strongly activate the somatosensory representation of a body part when presented close to it, whereas weaker activation is produced by stimuli presented at farther distances. Thus, the strongest crossmodal extinction will emerge from the simultaneous activation of the somatosensory representation of the left hand by a tactile stimulus and of that of the right hand by a proximal visual stimulus.

Vision is more relevant than proprioception for the representation of near peripersonal space (Làdavas et al., 2000), although the two provide complementary information (Haggard et al., 2000, Rossetti et al., 1995, van Beers et al., 1999). Consistent with the fact that the proprioceptive response of monkeys' bimodal neurons is weaker than that evoked when vision of the arm is allowed (Graziano, 1999, Graziano et al., 2000, MacKay and Crammond, 1987, Obayashi et al., 2000), we showed that visual–tactile processing in peripersonal space may rely solely upon visual cues about hand position. In patients with tactile extinction (Farnè et al., 2000), we presented visual stimuli far from their ipsilesional hand, which was placed behind their back, but near a rubber hand that could be either visually aligned or misaligned with the patients’ ipsilesional shoulder. Unseen tactile stimuli were concurrently delivered to their contralesional hand. The results showed that a visual stimulus presented near the visible right rubber hand induced strong crossmodal visuo–tactile extinction, comparable to that obtained by proximal visual stimulation of the patients’ real hand. Critically, crossmodal extinction was reduced when the rubber hand could not be visually attributed to the patients’ body (being misaligned with respect to their shoulder). The sight of a fake hand deceived the multisensory system coding peripersonal space, such that a visual stimulus presented near the rubber hand was processed as if it were near the real hand (see also Pavani et al., 2000). However, this phenomenon can take place only if the fake hand looks plausible as a personal body part, showing that visual dominance is not complete. In most everyday activities the felt and seen positions of the hand are actually congruent. Thus, the deception operated by a rubber hand may reflect a sort of impenetrability of the integrated visual–tactile system to discrepant information provided by proprioception and vision. Moreover, this impenetrability persists despite the subject's conscious awareness of the actual discrepancy between the senses.

Recent brain-imaging studies revealed that activation of the primary and secondary somatosensory cortices, similar to that obtained following tactile stimulation, can be induced when no touch is actually delivered but the subject is waiting for it (Carlsson et al., 2000). The common pattern of cerebral activity during real somatosensory stimulation and during its anticipation supports the role of top-down mechanisms in the tuning of sensory processing (Carlsson et al., 2000, Drevets et al., 1995, Roland, 1981). Top-down modulations have also been reported in motor, visual and somatosensory imagery studies, in which subjects merely imagine the skin sensation without anticipating any real stimulus (Hodge et al., 1998, Kosslyn et al., 1997, Roth et al., 1996).

These findings raise the question of whether multisensory processing of the space near the hand can be mediated by the expectancy of being touched. If this were true, crossmodal effects in peripersonal space would be impeded when the hand and the proximal visual object are physically separated, thereby preventing any top-down somatosensory expectancy. Thus, here we investigated whether strong crossmodal extinction can still be observed despite the subjects’ conscious awareness that their hand cannot be touched. Note that in the rubber hand study reported above (Farnè et al., 2000), patients knew that the experimenter's finger would not touch their own right hand, since the former was located near the rubber hand while the latter was behind their back. Nonetheless, strong crossmodal extinction was obtained through the illusory self-attribution of the rubber hand (see also Botvinick and Cohen, 1998, Pavani et al., 2000, Rorden et al., 1999). However, this implicit self-attribution leaves open the possibility that crossmodal extinction was due to visually induced tactile expectancy; that is, patients may have expected to feel a touch on the self-attributed rubber hand, which was located close to, but not physically protected from, the visual stimulation. Therefore, in the present study we investigated directly whether crossmodal effects depend upon the situation-related expectancy of being touched. To this aim, the perceptual performance of nine right-hemisphere brain-damaged patients with left tactile extinction was examined using a crossmodal visuo–tactile stimulation paradigm. The experimenter's hand was placed either near or far from the patient's right hand, which could be screened or not by a transparent Plexiglas barrier. On the basis of the impenetrability of the multisensory system reported above, we hypothesised that this system would process proximal visual stimuli as being near the body regardless of whether they could actually touch it, thus being somewhat resistant to more cognitive, top-down processing. This, in turn, might be the basis of the representation of peripersonal space that allows the kind of ‘unmotivated’ avoidance reactions cited above.

Section snippets

Subjects

A group of nine neurological patients gave their informed consent to participate in the study, which was approved by the local ethical committee. All patients were right-handed and had suffered a right unilateral lesion due to a cerebrovascular accident, as confirmed by CT or MRI scan. Table 1 illustrates the anatomical areas involved by the lesion in five patients, mapped according to the methods of Damasio and Damasio (1989). The medical file of one patient (S.S.) was not available. For the three

Unimodal tactile extinction

Patients performed well on catch trials, producing very few false alarms (only two ‘left’ responses were made by L.E.). The mean accuracy in reporting touches on the left hand was computed as a percentage for each patient, as a function of single versus double simultaneous stimulation. A one-way ANOVA with stimulation (unilateral, bilateral) as a within-subject factor was significant [F(1, 8)=137.29, P<0.0001], confirming that patients were affected by a rather severe form of left tactile extinction.
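As an illustration of this analysis, the following is a minimal sketch of a one-way repeated-measures ANOVA with stimulation (unilateral vs. bilateral) as the within-subject factor, written in Python with pandas and statsmodels; the accuracy values are hypothetical placeholders, not the study's data, and the (1, 8) degrees of freedom follow from two levels measured in nine patients.

    # Minimal sketch of the one-way within-subject ANOVA described above.
    # The accuracy scores below are hypothetical, for illustration only.
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    data = pd.DataFrame({
        "patient": list(range(1, 10)) * 2,                      # nine patients, two conditions each
        "stimulation": ["unilateral"] * 9 + ["bilateral"] * 9,  # within-subject factor
        "accuracy": [98, 95, 100, 97, 92, 99, 96, 94, 98,       # left-hand touches, unilateral
                     25, 10, 30, 15, 5, 20, 35, 12, 18],        # left-hand touches, bilateral
    })

    # One-way repeated-measures ANOVA; with 2 levels and 9 subjects
    # the F statistic is evaluated on (1, 8) degrees of freedom.
    result = AnovaRM(data, depvar="accuracy", subject="patient",
                     within=["stimulation"]).fit()
    print(result)

A markedly lower accuracy under bilateral stimulation, as in the placeholder values above, is the pattern that characterises left tactile extinction.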

Discussion

The study asked whether visuo–tactile extinction is affected by the patients’ top-down knowledge of whether the visual stimulus can reach their own hand. To this aim, the performance of left tactile extinction patients was investigated when their right hand was either covered or not by a transparent Plexiglas barrier. Three main findings were obtained. First, visual stimuli presented near the ipsilesional hand induced strong crossmodal extinction of contralesional tactile stimuli. Crucially,

Acknowledgements

We wish to thank all the patients for volunteering to participate in this study. We thank Tony Marcel for stimulating discussions on the topic addressed in this study. We also thank Francesca Frassinetti for reading the CT and MRI scans and reconstructing the lesions. This work was supported by a MIUR grant to E.L.

References (62)

  • F.C. Rind et al. Seeing what is coming: building collision-sensitive neurones. Trends Neurosci. (1999)
  • G. Rizzolatti et al. The organization of the cortical motor system: new concepts. Electroencephalogr. Clin. Neurophysiol. (1998)
  • G. Rizzolatti et al. Afferent properties of periarcuate neurons in macaque monkeys. II. Visual responses. Behav. Brain Res. (1981)
  • W.A. Ball et al. Stimulus dimensionality and infants’ perceived movement in depth. J. Genet. Psychol. (1983)
  • M.B. Bender. Disorders in Perception (1952)
  • S.J. Blakemore et al. Central cancellation of self-produced tickle sensation. Nat. Neurosci. (1998)
  • S.J. Blakemore et al. Spatio-temporal prediction modulates the perception of self-produced stimuli. J. Cogn. Neurosci. (1999)
  • M. Botvinick et al. Rubber hands ‘feel’ touch that eyes see. Nature (1998)
  • K. Carlsson et al. Tickling expectations: neural processing in anticipation of a sensory stimulus. J. Cogn. Neurosci. (2000)
  • C.L. Colby et al. Ventral intraparietal area of the macaque: anatomic location and visual response properties. J. Neurophysiol. (1993)
  • H. Damasio et al. Lesion analysis in neuropsychology (1989)
  • D. Denny-Brown et al. The significance of perceptual rivalry resulting from parietal lesions. Brain (1952)
  • G. di Pellegrino et al. Spatial extinction to double asynchronous stimulation. Neuropsychologia (1997)
  • G. di Pellegrino et al. Seeing where your hands are. Nature (1997)
  • W.K. Dong et al. Somatosensory, multisensory, and task-related neurons in cortical area 7b (PF) of unanesthetized monkeys. J. Neurophysiol. (1994)
  • W.C. Drevets et al. Blood flow changes in human somatosensory cortex during anticipated stimulation. Nature (1995)
  • J. Driver et al. Extinction as a paradigm measure of attentional bias and restricted capacity following brain injury
  • J.R. Duhamel et al. Congruent representation of visual and somatosensory space in single neurons of monkey ventral intra-parietal area (VIP)
  • J.R. Duhamel et al. Ventral intraparietal area of the macaque: congruent visual and somatic response properties. J. Neurophysiol. (1998)
  • J. Duncan. The locus of interference in the perception of simultaneous stimuli. Psychol. Rev. (1980)
  • J. Duncan. Coordinated brain systems in selective perception and action