Judgements of emotion in words and faces: ERP correlates

https://doi.org/10.1016/0167-8760(87)90006-7

Abstract

Visual event-related potentials (ERPs) to two types of stimuli (faces and words) were analyzed to determine the effects of the perceived emotional connotations of the stimuli (positive, neutral, or negative) in 10 right-handed, normally functioning adult males. Principal component analysis (PCA) of the ERPs revealed 5 factors accounting for over 90% of the ERP waveform variance for both faces and words. In the facial data, two ERP components varied in amplitude according to the perceived emotional connotation of the stimulus. For the P3 component, neutrally rated stimuli produced significantly larger amplitudes than stimuli rated as positive or negative. This effect was lateralized to the left hemisphere. A later positive component, the slow wave (448–616 ms), manifested complementary effects, i.e. faces perceived as positive and negative produced larger amplitudes than those perceived as neutral over the right hemisphere. The verbal stimuli did not result in significant main effects for perceived emotional connotation, but produced subtle connotation-related differences in slow wave topography. Hemispheric asymmetries, unrelated to affective connotation, were evident in the verbal data, manifesting different patterns of lateralization depending on the ERP component. The results suggest that differential processing of emotional connotation affects ERP waveforms and that the effects can be understood in terms of ERP components known to be associated with more general aspects of cognitive processing.
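The decomposition described above can be illustrated with a minimal sketch of a temporal PCA of averaged ERP waveforms, written in Python with NumPy and scikit-learn. This is a hypothetical example on synthetic data, not the authors' original analysis pipeline: the array shape (observations × time points), the simulated P3- and slow-wave-like shapes, and the five-component solution are assumptions made purely for illustration.

    # Minimal sketch (synthetic data, not the original analysis): temporal PCA of
    # averaged ERP waveforms, where each row is one hypothetical subject/electrode/
    # condition average and the columns are time points.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Hypothetical ERP averages: 60 observations x 128 time points (0-700 ms).
    n_obs, n_time = 60, 128
    t = np.linspace(0, 700, n_time)                  # ms after stimulus onset
    p3 = np.exp(-((t - 340) ** 2) / (2 * 40 ** 2))   # P3-like peak near 340 ms
    sw = np.exp(-((t - 530) ** 2) / (2 * 80 ** 2))   # slow-wave-like shape (448-616 ms)
    erps = (rng.normal(3.0, 1.0, (n_obs, 1)) * p3
            + rng.normal(2.0, 1.0, (n_obs, 1)) * sw
            + rng.normal(0.0, 0.3, (n_obs, n_time)))

    # Temporal PCA: time points are the variables, waveforms are the cases.
    pca = PCA(n_components=5)
    scores = pca.fit_transform(erps)                 # factor scores per waveform
    loadings = pca.components_                       # five temporal factors (time courses)

    print(f"Variance explained by 5 factors: {pca.explained_variance_ratio_.sum():.1%}")

In a design like the one described in the abstract, the resulting factor scores could then be submitted to an analysis of variance across emotion condition, hemisphere, and electrode site, analogous to testing component amplitudes for connotation effects.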

Cited by (119)

    • Separate neural networks of implicit emotional processing between pictures and words: A coordinate-based meta-analysis of brain imaging studies

      2021, Neuroscience and Biobehavioral Reviews
      Citation Excerpt:

      Pictures may allow access to emotional information more automatically than words, as the latter have to go through phonological processing first (Brosch et al., 2010). Researchers also point out that emotional pictures may be more biologically relevant and therefore are more physiologically arousing than words (Keil, 2006; Vanderploeg et al., 1987), whereas words may lead to stronger top-down effects on emotional processing (Carretie et al., 2008). Moreover, pictures may have privileged access to the emotional systems compared to words, as indicated by the fact that the emotionality of a picture interferes with affective categorization of words, but the reverse is not true (Houwer and Hermans, 1994).

    • Implicit predictions of future rewards and their electrophysiological correlates

      2017, Behavioural Brain Research
      Citation Excerpt:

      For example, work employing fMRI [23,24] has found that superficially neutral cues that predict negative, emotional images engage regions such as the anterior cingulate cortex, ventrolateral prefrontal cortex, and amygdala. Similar studies have been conducted using electrophysiological measures, but these studies have largely used predictive cues that are themselves affective [25,26]. Emotional content such as facial expressions and emotional words is known to cause affect-related neural responses [27,28], which renders these past studies somewhat ambiguous.

    • Emotional state classification from EEG data using machine learning approach

      2014, Neurocomputing
      Citation Excerpt:

      Besides research on asymmetrical activation of the cerebral hemispheres, event-related potentials have also been used to study the association between EEG and emotion. Vanderploeg et al. [38] reported that presentations of photos with emotional facial expressions elicited more negative amplitudes during 230–420 ms (peaking at about 340 ms) after stimulus onset than did neutrally rated stimuli. Wataru et al. [39] reported that visual presentations of faces with emotion (both fear and happiness) elicited a larger N270 over the posterior temporal areas, covering a broad range of posterior visual areas.
