Neural Architecture for Temporal Emotion Classification

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3068)

Abstract

In this pilot study, a neural architecture for temporal emotion recognition from image sequences is proposed. The investigation aims at developing key principles within an extendable experimental framework for studying human emotions. Features representing temporal facial variations are extracted within a bounding box around the face, which is partitioned into regions. Within each region, the optical flow is tracked over time. The dense flow field of each region is then integrated, and its principal components are estimated as a representative velocity of facial motion. For each emotion, a Fuzzy ARTMAP neural network is trained by incremental learning to classify the feature vectors produced by the motion-processing stage; single category nodes corresponding to the expected feature representation code the respective emotion classes. The architecture was tested on the Cohn-Kanade facial expression database.
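
The following is a minimal sketch, not the authors' implementation, of the motion-feature stage described above: the face bounding box is split into a grid of regions, dense optical flow is accumulated (temporally integrated) over the image sequence, and the principal component of each region's integrated flow serves as its representative velocity. The grid size, the Farneback flow estimator, and all parameter values are illustrative assumptions; the paper itself tracks flow with a Lucas-Kanade scheme [5] and feeds the resulting feature vectors to Fuzzy ARTMAP networks [6].

import cv2
import numpy as np

def region_flow_features(frames, face_box, grid=(3, 3)):
    """One feature vector per sequence: for every grid region of the face
    bounding box, the mean flow plus the dominant (principal) flow
    direction scaled by its strength."""
    x, y, w, h = face_box
    prev = cv2.cvtColor(frames[0][y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    accumulated = np.zeros((h, w, 2), np.float32)
    for frame in frames[1:]:
        curr = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        # Dense flow between consecutive frames (Farneback here; the paper
        # tracks the flow with a Lucas-Kanade scheme [5]).
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        accumulated += flow  # temporal integration of the dense flow field
        prev = curr

    rows, cols = grid
    features = []
    for r in range(rows):
        for c in range(cols):
            region = accumulated[r * h // rows:(r + 1) * h // rows,
                                 c * w // cols:(c + 1) * w // cols].reshape(-1, 2)
            mean = region.mean(axis=0)
            # Principal component of the region's flow vectors as a
            # representative velocity of the local facial motion.
            _, s, vt = np.linalg.svd(region - mean, full_matrices=False)
            features.extend(mean + (s[0] / np.sqrt(len(region))) * vt[0])
    return np.asarray(features, dtype=np.float32)

Each per-sequence feature vector would then be presented to one incrementally trained Fuzzy ARTMAP network per emotion, with a single category node coding the expected feature representation of that emotion, as the abstract describes.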

References

  1. Mase, K.: Human reader: A vision-based man-machine interface. In: Cipolla, R., Pentland, A. (eds.) Computer Vision for Human-Machine Interaction, pp. 53–81. Cambridge Univ. Press, Cambridge (1998)

  2. Bascle, B., Blake, A., Morris, J.: Towards automated, real-time, facial animation. In: Cipolla, R., Pentland, A. (eds.) Computer Vision for Human-Machine Interaction, pp. 123–133. Cambridge Univ. Press, Cambridge (1998)

  3. Essa, I.A., Pentland, A.P.: Facial expression recognition using a dynamic model and motion energy. In: ICCV, pp. 360–367 (1995)

  4. Kanade, T., Cohn, J.F., Tian, Y.: Comprehensive database for facial expression analysis. In: Proc. 4th IEEE Int. Conf. on Automatic Face and Gesture Recognition, Grenoble, France, pp. 46–53 (2000)

  5. Lucas, B.D., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Proc. 7th Int. Joint Conf. on Artificial Intelligence (IJCAI), pp. 674–679 (1981)

  6. Carpenter, G.A., Grossberg, S., Markuzon, N., Reynolds, J.H., Rosen, D.B.: Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps. IEEE Transactions on Neural Networks 3(5), 698–713 (1992)

  7. Weiss, S.M., Kulikowski, C.A.: Computer Systems That Learn: Classification and Prediction Methods from Statistics, Neural Nets, Machine Learning, and Expert Systems. Morgan Kaufmann, San Mateo (1991)

  8. Bassili, J.N.: Facial motion in the perception of faces and of emotional expression. Journal of Experimental Psychology: Human Perception and Performance 4(3), 373–379 (1978)

Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Schweiger, R., Bayerl, P., Neumann, H. (2004). Neural Architecture for Temporal Emotion Classification. In: André, E., Dybkjær, L., Minker, W., Heisterkamp, P. (eds.) Affective Dialogue Systems. ADS 2004. Lecture Notes in Computer Science (LNAI), vol. 3068. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24842-2_5

  • DOI: https://doi.org/10.1007/978-3-540-24842-2_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22143-2

  • Online ISBN: 978-3-540-24842-2

  • eBook Packages: Springer Book Archive
