Neuroscience Letters

Volume 452, Issue 3, 20 March 2009, Pages 219-223
Two cortical mechanisms support the integration of visual and auditory speech: A hypothesis and preliminary data

https://doi.org/10.1016/j.neulet.2009.01.060

Abstract

Visual speech (lip-reading) influences the perception of heard speech. The literature suggests at least two possible mechanisms for this influence: "direct" sensory–sensory interaction, whereby sensory signals from the auditory and visual modalities are integrated directly, likely in the superior temporal sulcus (STS), and "indirect" sensory–motor interaction, whereby visual speech is first mapped onto motor-speech representations in the frontal lobe, which in turn influence sensory perception via sensory–motor integration networks. We hypothesize that both mechanisms exist, and further that previous demonstrations of lip-reading functional activations in Broca's region and the posterior planum temporale reflect the sensory–motor mechanism. We tested one prediction of this hypothesis using fMRI. We assessed whether viewing visual speech (contrasted with facial gestures) activates the same network as a speech sensory–motor integration task (listening to and then silently rehearsing speech). Both tasks activated locations within Broca's area, dorsal premotor cortex, the posterior planum temporale (area Spt), and focal regions of the STS, all of which have previously been implicated in sensory–motor integration for speech. This finding is consistent with the view that visual speech influences heard speech via sensory–motor networks. Lip-reading also activated a much wider network in the superior temporal lobe than the sensory–motor task did, possibly reflecting a more direct cross-sensory integration network.
