The development of the perception of facial emotional change examined using ERPs

https://doi.org/10.1016/j.clinph.2010.07.013

Abstract

Objective

The development of the perception of changes in facial emotion was investigated using event-related potentials (ERPs) in children and adults.

Methods

Four different conditions were presented: (1) NH: a neutral face that suddenly changed to a happy face. (2) HN: the reverse of NH. (3) NA: a neutral face that suddenly changed to an angry face. (4) AN: the reverse of NA.

Results

In the bilateral posterior temporal areas, a negative component was evoked by all conditions in younger children (7–10 years old), older children (11–14 years old), and adults (23–33 years old) within 150–300 ms. Peak latency was significantly shorter and amplitude was significantly smaller in adults than younger and older children. Moreover, maximum amplitude was significantly larger for NH and NA than HN and AN in younger children and for NH than the other three conditions in adults.

Conclusion

The areas of the brain involved in perceiving changes in facial emotion have not matured by 14 years of age.

Significance

Our study is the first to clarify a difference between children and adults in the perception of facial emotional change.

Introduction

In daily life, much information regarding familiarity, sex, age, verbal gestures, and emotion is obtained from peoples’ faces (Bruce and Young, 1986). Information on emotion is particularly important for social interaction and is obtained from facial expressions.

Neuroimaging has been used to study the perception of facial emotion. In event-related potential (ERP) studies, in which electroencephalography (EEG) epochs are averaged, a negative component evoked by static faces at approximately 170 ms has been termed the N170. The N170 is significantly earlier and larger for faces than for other objects (e.g., cars) and may be specific to static faces (e.g., Rossion and Jacques, 2008). Some studies found that it was unaffected by emotion (e.g., Eimer and Holmes, 2002, Ashley et al., 2004, Holmes et al., 2005), whereas others showed that the N170 evoked by positive emotions occurred significantly earlier than that evoked by negative emotions, and that the N170 evoked by fearful faces was larger in amplitude than that evoked by neutral or surprised faces (Batty and Taylor, 2003). In functional magnetic resonance imaging (fMRI), several cortical areas, such as the fusiform area and the superior temporal sulcus (STS), were more activated by static faces than by other objects (e.g., Kanwisher et al., 1997, Kanwisher et al., 1999, Yovel and Kanwisher, 2005). The response in the fusiform area was greater for neutral faces than for facial expressions in a control group, unlike in a developmental prosopagnosia (DP) group (Van den Stock et al., 2008). Moreover, attention to facial emotion specifically enhanced activity in the right STS compared with attention to the face per se (Narumoto et al., 2001).
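The ERP logic summarized above can be illustrated with a minimal sketch: single-trial EEG epochs are averaged so that trial-to-trial noise cancels, and the N170 is then quantified by its peak latency and amplitude within a search window. All numbers below (sampling rate, window, simulated signal) are illustrative assumptions, not the parameters of any study cited here.

```python
import numpy as np

# Hypothetical sketch of ERP averaging and N170 peak measurement.
# Sampling rate, epoch count, and the simulated waveform are assumptions.
FS = 500                        # sampling rate in Hz (assumed)
t = np.arange(0, 0.5, 1 / FS)   # 0-500 ms post-stimulus

rng = np.random.default_rng(0)
# Simulate 60 single-trial epochs: a negative deflection peaking near
# 170 ms (a Gaussian bump of -5 microvolts) buried in Gaussian noise.
n170 = -5e-6 * np.exp(-((t - 0.170) ** 2) / (2 * 0.015 ** 2))
epochs = n170 + rng.normal(0, 2e-6, size=(60, t.size))

# Averaging across trials attenuates noise by ~sqrt(n_trials).
erp = epochs.mean(axis=0)

# Peak latency and amplitude inside a 130-250 ms search window.
win = (t >= 0.130) & (t <= 0.250)
peak_idx = np.argmin(erp[win])            # index of the most negative point
latency_ms = t[win][peak_idx] * 1000
amplitude_uv = erp[win][peak_idx] * 1e6
print(f"N170 latency: {latency_ms:.0f} ms, amplitude: {amplitude_uv:.1f} uV")
```

With enough trials, the recovered latency lands close to the simulated 170 ms peak; the same window-and-extremum procedure underlies the peak latency and maximum amplitude measures analyzed later in this paper.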

Changes in facial expression are experienced in everyday life and are more important to social communication than static expressions, and several studies have examined facial emotional change. In a psychological study, Matsuzaki and Sato (2008) examined the contribution of motion-related information to the perception of facial expressions using point-light displays of faces under two conditions: a motion condition, in which apparent motion was induced by replacing the image of a neutral facial expression with one of an emotional expression, and a repetition condition, in which the same image of an emotional facial expression was presented twice. The results showed a large contribution of motion-related information to the perception of facial expression. In an EEG study, ERPs were elicited by the successive presentation of two different faces without an interval in between: a smiling face was preceded by either a neutral facial expression of the same person, the smiling face of a different person, or a neutral facial expression of a different person, generating an expressional change, an individual change, or both. The ERP evoked by expressional change was larger than that evoked by individual change or by both changes in the bilateral posterior temporal areas (Miyoshi et al., 2004). An fMRI study showed that broad areas of the occipital and temporal cortices (inferior temporal-occipital gyri, middle temporal gyri, fusiform gyri), especially in the right hemisphere, were activated more by viewing changes in facial expression than by changes in mosaic (control) images (Sato et al., 2004).

There have been many studies on the development of face perception, especially static face perception, using neuroimaging methods. With EEG, the N170 was found to change with age (e.g., de Haan et al., 2002 on a putative infant N170; Itier and Taylor, 2004a, Itier and Taylor, 2004b, Taylor et al., 1999; and Taylor et al., 2004, a review of that group’s work). In a study with infants (de Haan et al., 2002), a putative “infant N170” showed sensitivity to the species to which a face belonged, but, unlike in adults, the orientation of the face did not influence processing until a later stage. Other ERP studies showed that, in terms of distribution, latency, and amplitude, the adult pattern of the N170 is not reached even by the mid-teens (Itier and Taylor, 2004a, Itier and Taylor, 2004b, Taylor et al., 1999, Taylor et al., 2004).

There have been numerous studies on the development of emotional perception using static faces. Pollak and Kistler (2002) found that control children tended to underidentify anger, unlike abused children. Several studies have examined facial emotion using ERPs and static faces (e.g., Batty and Taylor, 2006, Dennis et al., 2009; Striano and Reid, 2006 in 4-month-old infants; de Haan and Nelson, 1999). Batty and Taylor (2006) showed that the N170’s sensitivity to emotion seen in adults appeared late, only in 14- to 15-year-olds. Other studies have also reported changes in the N170 during emotional processing in healthy children and in children with Asperger’s syndrome (e.g., O’Connor et al., 2005).

Previous ERP studies showed that other early and late components are also sensitive to facial emotion. Dennis et al. (2009) showed that the latency of the P1, beginning around 90–120 ms after the onset of a visual stimulus, was shortened, whereas the amplitude of the N1, the component following the P1, was reduced in response to fearful as compared with sad faces. In a study of the N2, a negative component at around 200–400 ms, Lewis et al. (2007) showed that angry faces produced a larger N2 of shorter latency than happy faces.

We hypothesized that the brain activity reflecting the perception of changes in facial emotion alters with age, because the N170 changed with age in a previous study using static emotional expressions (Batty and Taylor, 2006). We therefore investigated this development in a large number of children and adults using EEG, which can record the brain activity of children for longer than MEG and has a higher temporal resolution than fMRI.

The main objectives of this ERP-based study were to investigate (1) when the ability to perceive facial emotional change matures and (2) how brain activity occurs in children and adults when they perceive changes in facial emotion. A total of eighty subjects, classified into three groups (7–10 years old, 11–14 years old, 23–33 years old), participated in the study. This is the first systematic and large-scale study of the development of the perception of facial emotional change using ERP.

Section snippets

Subjects

Thirty-nine younger children (23 females, 16 males; 7–10 years old, mean 9.3 years), 29 older children (10 females, 19 males; 11–14 years old, mean 12.7 years), and 12 adults (6 females, 6 males; 23–33 years old, mean 28.5 years) with normal or corrected visual acuity were studied. They had no history of neurological or psychiatric disorders. An additional 19 younger children and 6 older children participated, but their data were not analyzed due to large artifacts. All

Results

An early component such as the P100, which reflected the activity of the primary visual area in previous visual ERP studies, was not identified in adults at the O1 and O2 electrodes (Fig. 2). Therefore, the early component was analyzed only in younger and older children. For peak latency, a three-way ANOVA investigating the effect of age showed an effect of electrode but no interactions (Table 1). For maximum amplitude, a three-way ANOVA investigating the effect of age showed effects of
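The group comparisons reported here rest on ANOVAs over per-subject peak measures. As a minimal sketch of that logic, the example below runs a one-way ANOVA on simulated peak latencies for three age groups; the group sizes match the sample, but the latency values and the single-factor design are invented for illustration (the study itself uses multi-way ANOVAs).

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical sketch: one-way ANOVA on per-subject N170-like peak
# latencies (ms). Group sizes mirror the paper's sample; the simulated
# means and spreads are assumptions, not reported data.
rng = np.random.default_rng(1)
younger = rng.normal(230, 20, 39)   # 7-10 years, n = 39
older   = rng.normal(215, 20, 29)   # 11-14 years, n = 29
adults  = rng.normal(180, 15, 12)   # 23-33 years, n = 12

F, p = f_oneway(younger, older, adults)
print(f"F = {F:.2f}, p = {p:.4g}")
```

With group means this far apart, the ANOVA flags a reliable age effect, which would then be followed up with post hoc pairwise comparisons between groups.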

The early (positive) component

As an early (positive) component was identified in each condition at about 130 ms in younger and older children, but in no condition in adults, we speculated that: (1) the early (positive) component reflects not the perception of facial emotional change but the activity of the primary visual area, considering the results of previous ERP studies, and (2) an onset and/or offset response was evoked in the younger and older children but not in the adults. In addition,

Acknowledgements

This study was supported by a Grant-in-Aid for Scientific Research on Innovative Areas (22229525), “Face perception and recognition”, from the Ministry of Education, Culture, Sports, Science and Technology to K.M.

References (35)

  • G. Yovel et al. The neural basis of the behavioral face-inversion effect. Curr Biol (2005)
  • V. Ashley et al. Time course and specificity of event-related potentials to emotional expressions. Neuroreport (2004)
  • M. Batty et al. The development of emotional face processing during childhood. Dev Sci (2006)
  • A.T. Beck et al. An inventory for measuring depression. Arch Gen Psychiatry (1961)
  • V. Bruce et al. Understanding face recognition. Br J Psychol (1986)
  • R. Campbell et al. Face recognition and lipreading. A neurological dissociation. Brain (1986)
  • M. de Haan et al. Brain activity differentiates face and object processing in 6-month-old infants. Dev Psychol (1999)

    No financial interests.
