CHAD: A Chinese Affective Database

  • Conference paper
Affective Computing and Intelligent Interaction (ACII 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3784)

Abstract

Affective databases play an important role in affective computing, which has become an attractive field of AI research. Based on an analysis of existing databases, a Chinese affective database (CHAD) is designed and established for seven emotional states: neutral, happy, sad, fear, angry, surprise and disgust. Instead of the personal suggestion method, audiovisual materials are collected in four ways, comprising three types of laboratory recording plus movies; broadcast programmes are also included as a source for the vocal corpus. A comparison of the five sources yields two findings. First, although broadcast programmes achieve the best performance in the listening experiment, they still pose problems: copyright restrictions, a lack of visual information, and an inability to represent the characteristics of everyday speech. Second, laboratory recording using sentences with appropriate emotional content is an outstanding source of material, with performance comparable to broadcasts.


Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

You, M., Chen, C., Bu, J. (2005). CHAD: A Chinese Affective Database. In: Tao, J., Tan, T., Picard, R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2005. Lecture Notes in Computer Science, vol 3784. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11573548_70

  • DOI: https://doi.org/10.1007/11573548_70

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-29621-8

  • Online ISBN: 978-3-540-32273-3

  • eBook Packages: Computer Science (R0)
