Abstract
This paper presents a multimodal database developed within the EU-funded MYSELF project. The project aims to develop an e-learning platform endowed with affective computing capabilities for training relational skills through interactive simulations. The database contains data from 34 participants, covering physiological parameters, nonverbal vocal features, facial expressions and posture. Ten emotions were considered (anger, joy, sadness, fear, contempt, shame, guilt, pride, frustration and boredom), ranging from primary to self-conscious emotions of particular relevance to learning processes and interpersonal relationships. Preliminary results and analyses are presented, together with directions for future work.
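A database of this kind pairs each recording with a participant, an emotion label, and per-modality signals. The sketch below is purely illustrative (the paper does not specify a schema); all field names, the sample values, and the file-path fields are assumptions, shown only to make the described structure concrete.

```python
from dataclasses import dataclass
from typing import List

# The ten emotion labels named in the abstract.
EMOTIONS = [
    "anger", "joy", "sadness", "fear", "contempt",
    "shame", "guilt", "pride", "frustration", "boredom",
]

# Hypothetical record layout for one annotated sample; the actual
# database format is not described in this preview.
@dataclass
class Sample:
    participant_id: int           # 1..34 in the described corpus
    emotion: str                  # one of EMOTIONS
    physiological: List[float]    # e.g. heart rate, skin conductance (assumed)
    vocal_features: List[float]   # nonverbal acoustic features (assumed)
    facial_video: str             # path to facial-expression recording (assumed)
    posture_video: str            # path to posture recording (assumed)

    def __post_init__(self) -> None:
        # Reject labels outside the ten emotions considered in the study.
        if self.emotion not in EMOTIONS:
            raise ValueError(f"unknown emotion label: {self.emotion}")

# Example usage with made-up values.
s = Sample(1, "joy", [72.0, 0.4], [180.0, 62.5],
           "face_001.avi", "posture_001.avi")
print(s.emotion)  # → joy
```

Validating the label in `__post_init__` keeps the fixed ten-emotion vocabulary enforced at the record level rather than at analysis time.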
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Anolli, L. et al. (2005). A Multimodal Database as a Background for Emotional Synthesis, Recognition and Training in E-Learning Systems. In: Tao, J., Tan, T., Picard, R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2005. Lecture Notes in Computer Science, vol 3784. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11573548_73
Print ISBN: 978-3-540-29621-8
Online ISBN: 978-3-540-32273-3