Exploiting Correlation between Body Gestures and Spoken Sentences for Real-time Emotion Recognition

ABSTRACT
Humans communicate their affective states through different media, both verbal and non-verbal, often used at the same time. Knowledge of a user's emotional state plays a key role in providing personalized, context-aware information and services. This is the main reason why several algorithms for automatic emotion recognition have been proposed in recent years. In this work we exploit the correlation between a person's affective state and their simultaneous bodily expressions, in terms of speech and gestures, and propose a system for real-time emotion recognition from gestures. In a first step, the system builds a trusted dataset of association pairs (motion data → emotion pattern), also exploiting textual information. This dataset then serves as ground truth for a second step, in which emotion patterns are extracted from new, unclassified gestures. Experimental results demonstrate the good recognition accuracy and real-time capabilities of the proposed system.
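The second step described above, matching a new gesture against the trusted dataset of (motion data → emotion pattern) pairs, can be sketched as a nearest-neighbour lookup under Dynamic Time Warping (DTW), a common distance measure for gesture time series. This is a minimal illustrative sketch, not the authors' implementation: the function names, the 1-D motion sequences, and the toy labels are all assumptions.

```python
def dtw_distance(a, b):
    """Classic DTW cost between two 1-D sequences, O(len(a) * len(b))."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify_gesture(query, trusted_dataset):
    """Return the emotion label of the DTW-nearest gesture in the dataset."""
    best_label, best_dist = None, float("inf")
    for sequence, emotion in trusted_dataset:
        d = dtw_distance(query, sequence)
        if d < best_dist:
            best_label, best_dist = emotion, d
    return best_label

# Toy "trusted dataset" (hypothetical): fast oscillating motion labelled
# "anger", slow monotone drift labelled "sadness".
dataset = [
    ([0, 3, 0, 3, 0, 3], "anger"),
    ([0, 1, 1, 2, 2, 3], "sadness"),
]
print(classify_gesture([0, 3, 1, 3, 0, 2], dataset))  # → anger
```

In a real system each sample would be a multi-dimensional sequence of skeletal joint positions rather than a scalar series, and the DTW cost would use a per-frame Euclidean distance, but the lookup structure stays the same.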