Functional MR imaging exposes differential brain responses to syntax and prosody during auditory sentence comprehension
Introduction
Since several neurologists in the 19th century (Paul Broca, Marc Dax, Carl Wernicke) claimed its involvement in language functions, the brain's left hemisphere has been viewed as the language-dominant one. Their observations indicated that lesions in anterior and posterior parts of the left peri-sylvian region caused severe deficits in producing and comprehending speech. To date, most neurological textbooks still favour the traditional view which holds that language functions reside in Broca's and Wernicke's areas (Stowe, Haverkort, & Zwarts, 2003). Originally, Broca's area was depicted as subserving speech production exclusively, whereas Wernicke's area was assigned to aspects of speech comprehension. In the 20th century, however, neurolinguistics provided evidence that a lesion in Broca's area and the surrounding tissue also affects complex sentence comprehension, in particular syntactic processing (Caramazza & Zurif, 1976; for a review see Grodzinsky, 2000). Since then, Broca's area has been proposed as the seat of syntax in the brain for both speech production and comprehension, even though Broca's aphasics have been shown to make use of syntactic knowledge (Linebarger, Schwartz, & Saffran, 1983).
To date, two aspects of the traditional claim have been called into question. The first concerns the specificity of Broca's area with respect to syntactic processing. The second is the left hemisphere's preponderance for language functions, which remains a matter of debate since recent neuroimaging studies have shown that speech processing also involves areas in the right hemisphere (Binder et al., 2000; Burton, 2001; Burton et al., 2000; Caplan and Dapretto, 2001; Hickok and Poeppel, 2000; Meyer et al., 2000; Vouloumanos et al., 2001). With respect to the former aspect, lesion studies investigating patients who display symptoms of agrammatism suggested that Broca's area is the locus of syntactic processing. The widely held belief that Broca's area is specifically associated with syntactic processing has been supported by recent neuroimaging studies (Caplan et al., 1998, 1999, 2000; Dapretto and Bookheimer, 1999; Embick et al., 2000; Inui et al., 1998; Kang et al., 1999; Ni et al., 2000). This assertion, however, has been called into question by other recent neuroimaging investigations suggesting that Broca's area is involved in a wide range of cognitive and perceptuomotor functions subserving language as well as nonlinguistic processes, for example the analysis of lexical–semantic information (Poldrack et al., 1999), harmonic sequences (Maess, Koelsch, Gunter, & Friederici, 2001), slow tonal frequency glides (Müller, Kleinhans, & Courchesne, 2001), visually prompted digit sequence learning (Müller, Kleinhans, Pierce, Kemmotsu, & Courchesne, 2002), perception of the rhythm of motion (Schubotz & von Cramon, 2001), imagery of motion (Binkofski et al., 2000), and segmentation processes in speech perception (Burton et al., 2000). Furthermore, a review of phonological studies by Poeppel provides evidence for an essential contribution of Broca's area to verbal working memory (Poeppel, 1996).
Taken together, these findings are inconsistent with an exclusively syntactic specialization of Broca's area. Moreover, the studies that assigned syntactic functions to Broca's area confounded either syntactic complexity and verbal working memory demands, or syntactic and semantic processing under the same task instructions (Bavelier et al., 1997; Caplan et al., 1998; Dapretto and Bookheimer, 1999). A recent fMRI study investigating syntactic complexity and working memory demands separately demonstrated that activation in Broca's area reflects syntactic working memory requirements rather than syntactic complexity (Fiebach, Schlesewsky, & Friederici, 2001). Thus, the functional role of Broca's area cannot be claimed to be uniquely syntactic.
An additional challenge for the view that the language-relevant areas are located exclusively in the left hemisphere comes from studies that investigated sentence-level syntactic processing (Just et al., 1996; Keller et al., 2001; Meyer et al., 2000; Müller et al., 1997; Newman et al., 2001; Sakai et al., 2001; Vandenberghe et al., 2002). These studies reported bilateral peri-sylvian regions to be involved whenever a resting period or auditory non-speech stimuli served as baselines. A substantial involvement of the right hemisphere was observed whenever sentences were presented auditorily (Mazoyer et al., 1993; Meyer et al., 2000; Müller et al., 1997), suggesting that the comprehension of auditorily presented normal sentences, in particular, recruits bilateral areas in the peri-sylvian region. However, the investigation of normal sentence comprehension (in comparison to such baselines) does not necessarily expose brain areas specifically involved in syntactic computation, because in normal sentences syntactic cues are confounded with lexical–semantic information. The present article aims to address some of these issues. The first experiment was designed to identify brain areas specifically assigned to syntactic processing. To avoid any confound between lexical–semantic and syntactic cues, we investigated the processing of pseudo speech, which consists of grammatically correct sentences in which nouns, adjectives and verbs are replaced by pseudowords. By presenting pseudo speech we expected to activate the neural network subserving syntactic functions to a greater extent, because pseudo speech is thought to emphasize syntactic operations, i.e. building a sentence structure and dealing with morphosyntactic information. The second experiment was performed to identify the particular contribution of the right hemisphere to speech comprehension. To achieve this goal, we constructed stimulus material which was free of lexical–semantic and syntactic information but preserved the prosodic cues (intonation contour) of normal sentences. In this experiment, we compared (semantically) normal sentences and (semantically meaningless) pseudo sentences within one experimental paradigm.
Materials and methods
Stimuli and design. All sentences were recorded by a trained female speaker in a soundproof room (IAC) and digitized at a 16-bit/44.1 kHz sampling rate. The mean sentence length was 3.4 s (SD = 0.36) in the normal speech condition and 3.6 s (SD = 0.35) in the pseudo speech condition. The mean sound pressure level was 20 dB SPL for the normal speech condition and 21 dB SPL for the pseudo speech condition.
Normal speech—Grammatically, semantically and phonologically correct sentences.
Die
Results
Performance data. Judgment performance was near-perfect, with 96.61% correct responses (SE=1.26) for the normal speech condition and 89.58% (SE=4.94) for the pseudo speech condition.
fMRI Data. For sentence processing, strong brain activation was observed in superior temporal regions (STR) for both normal and pseudo speech (see Fig. 1A and B).
This activation included the anterior STR (planum polare), the middle STR (Heschl's gyrus),
Discussion
The results suggest that the cerebral network subserving auditory sentence comprehension involves temporal areas predominantly in the left, but also in the right hemisphere. Consistent with recent claims (Humphries et al., 2001, Mazoyer et al., 1993, Meyer et al., 2000, Müller et al., 1997, Schlosser et al., 1998, Stowe et al., 1998), sentences produced a large region of activation in STRs bilaterally with the anterior part (planum polare) playing a special role. This finding suggests that the
Materials and methods
Stimuli and design. To generate degraded speech, the normal speech files were treated with the PURR-filtering procedure (Sonntag & Portele, 1998). The segmental content, i.e. the spectral qualities of the speech signal, was completely removed. In more detail, pitch marks were extracted from the original signal, which was then dynamically low-pass filtered with a cut-off frequency up to the third harmonic. The cut-off frequency of the unvoiced segments, i.e. of the aperiodic parts
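The dynamic low-pass filtering step can be sketched roughly as follows. This is a minimal illustration only, not the PURR implementation itself: the frame-wise design, the Butterworth filter, and the parameter names (`f0_track`, `frame_len`, `n_harmonics`) are all assumptions for the sake of the example.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def purr_like_filter(signal, f0_track, sr, n_harmonics=3, frame_len=1024):
    """Frame-wise dynamic low-pass filter: each frame is low-passed at
    n_harmonics * local F0, stripping spectral (segmental) detail above
    the third harmonic while preserving the intonation contour."""
    out = np.zeros_like(signal, dtype=float)
    n_frames = len(signal) // frame_len
    for i in range(n_frames):
        start, stop = i * frame_len, (i + 1) * frame_len
        # The cut-off tracks the local fundamental frequency; clamp it
        # safely below the Nyquist frequency.
        cutoff = min(n_harmonics * f0_track[i], 0.45 * sr)
        sos = butter(4, cutoff, btype="low", fs=sr, output="sos")
        out[start:stop] = sosfiltfilt(sos, signal[start:stop])
    return out
```

Applied to a voiced signal, such a filter leaves the pitch contour audible while rendering the segmental (phonemic) content unintelligible, which is the effect the degraded speech condition relies on.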
Results
Performance data. Passive/active judgment performance was almost perfect for the normal speech (93.35%, SE=2.77) and for the pseudo speech (91.76%, SE=2.90) conditions. Degraded speech produced responses at chance level (51.78%, SE=1.92), clearly demonstrating that pure speech melody cannot be syntactically or lexically decoded.
fMRI data. Analyses of inter-subject averaged responses revealed that hearing normal, pseudo and degraded speech consistently activated cortical segments along the
Discussion
Activation for pseudo speech relative to normal speech was greater bilaterally in the STR, replicating the activation pattern found in Experiment 1. In the pseudo speech condition, all content words were replaced by phonotactically legal pseudowords. Thus, the increase in superior temporal activation may reflect the additional processing required when subjects hear unknown pseudowords and fail to find an equivalent lexical entry. This explanation is also consistent with
Conclusion
Together, the data from the two experiments suggest that normal auditory sentence comprehension occurs automatically, recruiting the STR bilaterally. As hypothesized, no significant activation was found in Broca's area for any sentence condition. This finding indirectly supports the view that Broca's area subserves domain-general rather than specifically linguistic processes (Kaan & Swaab, 2002; Meyer et al., 2000; Müller et al., 2001; Stowe et al., 2003). However,
Acknowledgements
The authors wish to thank Alice Turk and Adam McNamara for helpful comments on the manuscript. The work was supported by the Leibniz Science Prize awarded to Angela Friederici.
References
Characterizing sentence intonation in a right-hemisphere damaged population. Brain and Language (1989).
Human temporal-lobe response to vocal sounds. Cognitive Brain Research (2002).
The role of inferior frontal cortex in phonological processing. Cognitive Science (2001).
PET studies of syntactic processing with auditory sentence presentation. NeuroImage (1999).
Form and content: dissociating syntax and semantics in sentence comprehension. Neuron (1999).
Towards a neural basis of auditory sentence processing. Trends in Cognitive Sciences (2002).
Auditory language comprehension: an event-related fMRI study on the processing of syntactic and lexical information. Brain and Language (2000).
The trouble with cognitive subtraction. NeuroImage (1996).
Towards a functional neuroanatomy of speech perception. Trends in Cognitive Sciences (2000).
Syntactic processing in left prefrontal cortex is independent of lexical meaning. NeuroImage (2001).
Phonetic perception and the temporal cortex. NeuroImage.
The brain circuitry of syntactic comprehension. Trends in Cognitive Sciences.
An event-related fMRI study of implicit phrase-level syntactic and semantic processing. NeuroImage.
The effect of spectral manipulation on the identification of affective and linguistic prosody. Brain and Language.
Lipsia: a new software system for the evaluation of functional magnetic resonance images of the human brain. Computerized Medical Imaging and Graphics.
Neurocognition of auditory sentence comprehension: event-related fMRI reveals sensitivity to syntactic violations and task demands. Cognitive Brain Research.
Human primary auditory cortex: cytoarchitectonic subdivisions and mapping into a spatial reference system. NeuroImage.
Broca's area and the discrimination of frequency transitions: a functional MRI study. Brain and Language.
Functional MRI of motor sequence acquisition: effects of learning stage and performance. Cognitive Brain Research.
Cortical activation with sound stimulation in cochlear implant users demonstrated by positron emission tomography. Cognitive Brain Research.
Functional specialization for semantic and phonological processing in the left inferior prefrontal cortex. NeuroImage.
Hemispheric lateralization effects of rhythm implementation during syllable repetitions: an fMRI study. NeuroImage.
Sentence processing in the cerebral cortex. Neuroscience Research.
Lateralization of prosody during language production: a lesion study. Brain and Language.
PURR: a method for prosody evaluation and investigation. Journal of Computer Speech and Language.
Localization of syntactic comprehension by positron emission tomography. Brain and Language.
Brain regions involved in articulation. The Lancet.
The interpretation of sentence ambiguity in patients with unilateral focal brain surgery. Brain and Language.
A trial-based experimental design for fMRI. NeuroImage.
Sentence reading: a functional MRI study at 4 Tesla. Journal of Cognitive Neuroscience.
Human temporal lobe activation by speech and nonspeech sounds. Cerebral Cortex.
Function of the left planum temporale in auditory and linguistic processing. Brain.
Functional magnetic resonance imaging of human auditory cortex. Annals of Neurology.
Broca's region subserves imagery of motion: a combined cytoarchitectonic and fMRI-study. Human Brain Mapping.
Hemispheric processing of intonation contours. Cortex.
Wernicke's region—Where is it? Annals of the New York Academy of Sciences.
Statistical analysis of multi-subject fMRI data: the assessment of focal activations. Journal of Magnetic Resonance Imaging.
The role of right hemisphere in interpretation of figurative aspects of language. A positron emission tomography activation study. Brain.
Disturbances of speech prosody following right hemisphere infarcts. Acta Neurologica Scandinavica.
Language prosody in the right hemisphere. Aphasiology.
The role of segmentation in phonological processing: an fMRI investigation. Journal of Cognitive Neuroscience.
Effects of syntactic structure and propositional number on patterns of regional cerebral blood flow. Journal of Cognitive Neuroscience.
Activation of Broca's area by syntactic processing under conditions of concurrent articulation. Human Brain Mapping.
Making sense during conversation: an fMRI study. NeuroReport.
Location of lesions in stroke patients with deficits in syntactic processing in sentence comprehension. Brain.
Functional MRI of language processing: dependence on input modality and temporal lobe epilepsy. Epilepsia.
Dissociation of algorithmic and heuristic processes in language comprehension: evidence from aphasia. Brain and Language.
Neural basis for sentence comprehension: grammatical and short-term memory components. Human Brain Mapping.
The linguistic basis of left hemisphere specialization. Science.
Event-related functional MRI: implications for cognitive psychology. Psychological Bulletin.