DOI: 10.1145/1941007.1941035
research-article

Emotion Recognition: A Multimodal Interaction Point of View (Reconnaissance d'Emotions : un point de vue interaction multimodale)

Published: 20 September 2010

ABSTRACT

Emotion recognition is a young but maturing research field, for which there is an emerging need for engineering models, and in particular design models. To address these engineering challenges, we reuse and adapt results from the research field of multimodal interaction, since the expression of an emotion is intrinsically multimodal. In this paper, we refine the definition of an interaction modality for the case of passive emotion recognition. We also study the combination of modalities by applying the CARE properties. Finally, we highlight the benefits of our design model for emotion recognition.
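The CARE properties (Complementarity, Assignment, Redundancy, Equivalence) mentioned in the abstract characterize how several interaction modalities can be combined. As a rough illustration only, the sketch below classifies how two hypothetical emotion-recognition modalities (facial expression and voice prosody) might combine; the `Observation` type, the temporal-window rule, and all names are assumptions for illustration, not the model defined in the paper.

```python
from dataclasses import dataclass

# Hypothetical output of a single-modality emotion recognizer
# (illustrative structure, not taken from the paper).
@dataclass
class Observation:
    modality: str      # e.g. "face", "voice"
    emotion: str       # recognized emotion label, e.g. "joy"
    confidence: float  # score in [0, 1]
    time: float        # detection time in seconds

def combine_care(a: Observation, b: Observation, window: float = 1.0) -> str:
    """Classify two observations in the spirit of the CARE properties.

    Assumed simplification: observations far apart in time are treated as
    interchangeable uses of either modality (Equivalence); temporally
    close observations carrying the same emotion are Redundant, and
    close observations carrying different cues are Complementary
    (they must be fused into a single interpretation).
    """
    if abs(a.time - b.time) > window:
        return "equivalence"
    if a.emotion == b.emotion:
        return "redundancy"
    return "complementarity"

face = Observation("face", "joy", 0.9, time=0.2)
voice = Observation("voice", "joy", 0.7, time=0.5)
print(combine_care(face, voice))  # redundancy
```

Assignment (a task bound to exactly one modality) is a design-time constraint rather than a runtime combination, so it does not appear in this runtime classification.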

References

  1. Bolt, R. A. "Put-That-There": Voice and gesture at the graphics interface. In SIGGRAPH '80: Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, pages 262--270, New York, NY, USA, 1980. ACM.
  2. Bouchet, J. Ingénierie de l'interaction multimodale en entrée : approche à composants ICARE. Ph.D. thesis, Université Joseph Fourier, Grenoble 1, 2006.
  3. Clay, A., Couture, N. and Nigay, L. Engineering affective computing: a unifying software architecture. In Proceedings of the 3rd IEEE International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII '09), pages 1--6, 2009.
  4. Clay, A. La branche émotion, un modèle conceptuel pour l'intégration de la reconnaissance multimodale d'émotions dans des applications interactives : application au mouvement et à la danse augmentée. Ph.D. thesis, Université Bordeaux 1, Bordeaux, 2009.
  5. Gaines, B. R. Modeling and forecasting the information sciences. Information Sciences, 57--58:3--22, 1991.
  6. Jaimes, A. and Sebe, N. Multimodal human-computer interaction: A survey. Computer Vision and Image Understanding, 108(1--2):116--134, 2007.
  7. Lisetti, C. L. Le paradigme MAUI pour des agents multimodaux d'interface homme machine socialement intelligents. Revue d'Intelligence Artificielle, Numéro Spécial sur les Interactions Emotionnelles, 20(4--5):583--606, 2006.
  8. Martin, J. C. TYCOON: Theoretical framework and software tools for multimodal interfaces. Intelligence and Multimodality in Multimedia Interfaces, 1998.
  9. Nigay, L. and Coutaz, J. A generic platform for addressing the multimodal challenge. In CHI '95: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 98--105, New York, NY, USA, 1995. ACM Press/Addison-Wesley Publishing Co.
  10. OpenInterface European project. IST Framework 6 STREP funded by the European Commission (FP6-35182). www.oi-project.org.
  11. Pantic, M., Sebe, N., Cohn, J. F. and Huang, T. Affective multimodal human-computer interaction. In MULTIMEDIA '05: Proceedings of the 13th Annual ACM International Conference on Multimedia, pages 669--676, New York, NY, USA, 2005. ACM.
  12. Scherer, K. R. On the nature and function of emotion: a component process approach. In K. R. Scherer and P. Ekman (eds.), Approaches to Emotion. Erlbaum, Hillsdale, NJ, 1984.
  13. Scherer, K. R. Feelings integrate the central representation of appraisal-driven response organization in emotion. In Feelings and Emotions: The Amsterdam Symposium, pages 136--157, 2004.
  14. Zeng, Z., Pantic, M., Roisman, G. I. and Huang, T. S. A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(1):39--58, 2009.


Published in

IHM '10: Proceedings of the 22nd Conference on l'Interaction Homme-Machine
September 2010, 262 pages
ISBN: 9781450304108
DOI: 10.1145/1941007

Copyright © 2010 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rate

Overall acceptance rate: 103 of 199 submissions, 52%