Amygdala activation at 3T in response to human and avatar facial expressions of emotions

https://doi.org/10.1016/j.jneumeth.2006.10.016

Abstract

Facial expressions of emotions are important in nonverbal communication. Although numerous neural structures have been identified as involved in emotional face processing, the amygdala is thought to be a core moderator. While previous studies have relied on facial images of humans, the present study is concerned with the effect of computer-generated (avatar) emotional faces on amygdala activation. Moreover, activation patterns elicited by viewing avatar faces are compared with the neuronal responses to human facial expressions of emotions. Twelve healthy subjects (five females) performed facial emotion recognition tasks with optimized 3T event-related fMRI. Robust amygdala activation was apparent in response to both human and avatar emotional faces, but the response was significantly stronger to human faces in face-sensitive structures, i.e. the fusiform gyri. We suggest that avatars could be a useful tool in neuroimaging studies of facial expression processing because they elicit amygdala activation similarly to human faces, yet have the advantage of being highly manipulable and fully controllable. However, the finding of differences between human and avatar faces in face-sensitive regions indicates the presence of mechanisms by which human brains can differentiate between them. These mechanisms merit further investigation.

Introduction

Recognizing the emotional expression in a face is a paramount social-cognitive competency of humans, which relies on a widespread neuronal network involving cortical and subcortical-limbic structures (Adolphs, 2002). Studies have demonstrated that the amygdala is an important moderator of emotional processing, especially for the perception and recognition of emotional faces (see for review Calder et al., 2001). Several neuroimaging studies have highlighted amygdala activation in response to viewing fearful facial expressions (Adolphs et al., 1995, Irwin et al., 1996, Morris et al., 1996, Morris et al., 1998, Broks et al., 1998, Sprengelmeyer et al., 1999, Phillips et al., 2001), and even subliminal presentation of fearful expressions has been found to cause strong activation of the amygdala (Whalen et al., 1998, Morris et al., 1999). Showing fearful eyes alone also resulted in increased amygdala activation (Morris et al., 2002). This supports the view of an implicit and coarse appraisal role for the amygdala in assessing the valence of stimuli (threatening or pleasant) (LeDoux, 2000, Critchley et al., 2000), in particular when novel stimuli are presented (Schwartz et al., 2003). However, the amygdala's role in processing facial emotions other than fear is not as well documented, although recent data on amygdala activation in response to emotional and neutral facial expressions further indicate involvement of the amygdala, based on its evaluation function, in processing across emotions (Fitzgerald et al., 2006, Yang et al., 2002).

Most studies on the perception of facial emotional expression have used two-dimensional (2D) photographs of real people. However, faces carry much more information than the subject's emotion (e.g. facial shape and texture cues), which might modulate brain activity (Fink and Penton-Voak, 2002, Kranz and Ishai, 2006, Johnston, 2006 on facial appearance). In addition, the intensity of the facial expression of “real” people cannot be systematically controlled and varied. Moreover, photographic and lighting conditions and nonverbal signs, such as head tilts or head turns, are possible confounds. Sato et al. (2002) used facial morphing techniques to study facial expression recognition in a patient with bilateral amygdala damage, creating facial expressions that blended happiness and fear, happiness and anger, or happiness and sadness for a subsequent categorization task. Facial morphing is a recognized method for the study of facial appearance and perception, but it can distort reality and introduce artefacts, such as texture smoothing from digital blending.

A possible means of circumventing these problems is the use of three-dimensional (3D) virtual human face models (avatars). Avatars can be animated and deformed according to the experimenter's needs, yet still represent the relevant action units (AUs; Ekman and Friesen, 1969, Ekman and Friesen, 1978) in the face. The term avatar derives from Sanskrit, where it denoted the bodily manifestation of an immortal being; in computing it refers to the icon representing a user in internet forums or games. The term has also been adopted for computer-generated representations of humans (without being linked to a particular person), as applied here. Avatars allow the generation of stimuli with well-defined facial appearance, for example faces systematically varying in age, sex and ethnicity. Photorealistic 3D models can be rendered under controlled lighting conditions, camera angle and focal length.

Useful applications of virtual characters have been investigated in several domains of human interaction (e.g. Morris et al., 2005) and communication (e.g. De Leo et al., 2003, Grammer and Oberzaucher, 2006). Bailenson et al. (2003) demonstrated that humans develop an experience of being with another person when confronted with avatars. Bailenson et al. (in press) showed that people were prepared to share the most personal information with avatars that were low in realism and high in behavioral similarity. According to Ku et al. (2005), avatars have the potential to influence viewers' emotions, depending on the gender of the avatar and the intensity of its expression. While some brain imaging studies have used morphed faces (e.g. Gur et al., 1994, Sato et al., 2002) or virtual environments (e.g. Morris et al., 2005), little is known, to our knowledge, about the effect of avatar faces on brain responses.

In the present study, we investigated neuronal activation, in particular amygdala activation, in response to emotional expressions of real and virtual faces. Our aim was to determine whether facial expressions of avatars can elicit amygdala activation similar to that evoked by facial images of real people. We hypothesized that avatar facial expressions would elicit an amygdala response similar to that elicited by images of human faces. We also hypothesized stronger activation for human stimuli in brain areas known to play a role in face processing, in particular an elevated response of the fusiform gyrus and temporal regions.

Section snippets

Sample

Twelve healthy, right-handed participants (five females) aged 23–35 years (mean age = 27.67, S.D. = 4.42) were recruited by advertisements at the University of Vienna and the Medical University of Vienna, Vienna, Austria. All subjects were paid €35 for their participation. Written informed consent was obtained after a complete description of the study. The experiment was approved by the local ethics committee. The presence of psychiatric disorders (according to DSM-IV) was ruled out

Behavioral data

For emotion recognition performance, a repeated-measures ANOVA revealed a significant main effect of emotion (F(5,6) = 6.458, p = 0.002), with the highest recognition for sad faces and the lowest performance for expressions of disgust, and a significant main effect of stimulus type (human versus avatar; F(1,10) = 25.885, p < 0.001), indicating better performance for human faces. Furthermore, significant emotion × stimulus type (F(5,6) = 4.270, p = 0.011) and stimulus type × gender interactions (F(1,10) = 6.234, p = 0.032) were
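An analysis of this shape can be illustrated with a short sketch. This is not the authors' analysis code: it runs a two-way repeated-measures ANOVA on synthetic recognition-accuracy data using `statsmodels`, with the two within-subject factors (emotion, stimulus type) from the study; the set of six emotion labels is assumed for illustration, and the between-subject gender factor is omitted because `AnovaRM` handles within-subject factors only (the study's mixed design would need a mixed-model approach).

```python
# Illustrative sketch only: two-way repeated-measures ANOVA on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
emotions = ["anger", "disgust", "fear", "happiness", "sadness", "neutral"]  # assumed label set

# 12 subjects x 6 emotions x 2 stimulus types, one accuracy score per cell
rows = [
    {"subject": s, "emotion": e, "stimulus": t,
     "accuracy": float(np.clip(rng.normal(0.8, 0.1), 0.0, 1.0))}
    for s in range(12)
    for e in emotions
    for t in ("human", "avatar")
]
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="accuracy", subject="subject",
              within=["emotion", "stimulus"]).fit()
print(res.anova_table)  # F and p values for both main effects and the interaction
```

With real data in the same long format (one row per subject × condition cell), the table would report the emotion and stimulus-type main effects and their interaction, analogous to the F statistics quoted above.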

Discussion

This functional MRI study investigated the effect of real and artificial emotional faces (avatars) on amygdala activation in healthy human subjects. The behavioral data indicated that all emotional stimuli, both human and avatar, were processed and that the depicted emotions were recognized above chance. We observed robust, bilateral amygdala activation in response to both types of stimuli. Direct comparison revealed more pronounced activation for real faces in face-sensitive areas, e.g. fusiform gyri

Acknowledgement

This study was supported by the Austrian Science Fund (grant P16669-B02 to E.M.) and by grant MH60722 to R.C.G.

References (48)

  • R.C. Oldfield. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia (1971)
  • S. Robinson et al. Optimized 3T EPI of the amygdalae. NeuroImage (2004)
  • W. Sato et al. Seeing happy emotion in fearful and angry faces: qualitative analysis of facial expression recognition in a bilateral amygdala-damaged patient. Cortex (2002)
  • W. Sato et al. Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Cogn Brain Res (2004)
  • L. Schilbach et al. Being with virtual others: neural correlates of social interaction. Neuropsychologia (2006)
  • C.E. Schwartz et al. Differential amygdalar response to novel versus newly familiar neutral faces: a functional MRI probe developed for studying inhibited temperament. Biol Psychiatry (2003)
  • C. Windischberger et al. On the origin of respiratory artifacts in BOLD-EPI of the human brain. Magn Reson Imaging (2002)
  • R. Adolphs et al. Fear and the human amygdala. J Neurosci (1995)
  • J.N. Bailenson, N. Yee, D. Merget, R. Schröder. The effect of behavioural realism and form realism of real-time avatar faces...
  • J.N. Bailenson et al. Interpersonal distance in immersive virtual environments. Pers Soc Psychol Bull (2003)
  • L. Cahill. Why sex matters for neuroscience. Nat Rev Neurosci (2006)
  • A.J. Calder et al. Neuropsychology of fear and loathing. Nat Rev Neurosci (2001)
  • H.D. Critchley et al. Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic imaging study. Hum Brain Mapp (2000)
  • G. De Leo et al. A virtual reality system for the training of volunteers involved in health emergency situations. Cyberpsychol Behav (2003)
Cited by (105)

    • The role of sex and emotion on emotion perception in artificial faces: An ERP study

      2022, Brain and Cognition
      Citation excerpt:

      In addition, our data supplement previous knowledge with the evidence that this effect is still present even in computer-generated faces and is most visible in the recognition of the subtle expression of sadness. Contrary to our results, Moser et al. (Moser et al., 2007) did not find a significant sex effect (F(1,10) = 2.984, p = 0.115) in an emotion recognition study with virtual faces. However, the sample size in their study was relatively small (12 participants).

    • Intimacy perception: Does the artificial or human nature of the interlocutor matter?

      2020, International Journal of Human Computer Studies
      Citation excerpt:

      However, the authors suggested that the perception of the avatar’s pain induces personal distress rather than an empathic response. Other studies also reported brain activation in emotional face-processing areas in response to human and artificial emotional facial expressions (Moser et al., 2007; Mühlberger et al., 2009). Taken together, we posit that the aforementioned evidence suggests that expression of emotions in virtual characters can be perceived similarly to human emotions, with corresponding behavioral and physiological activation (de Borst and de Gelder, 2015).

    • Amygdala responds to direct gaze in real but not in computer-generated faces

      2020, NeuroImage
      Citation excerpt:

      We note some limitations to the present study. First, whereas Moser et al. (2007) used CG faces that had lower emotional intensity than real faces, in our study anger and fear were actually recognized better and anger was rated more intense when posed by CG faces. Even though these differences were relatively small (e.g., for intensity ratings, at most 14 points difference [upper 95% CL] on the 100-step scale), it is nevertheless possible that they might have compensated for otherwise weaker emotional responses to the CG faces.
