Abstract
This paper proposes to recognize and analyze expressive gestures using a descriptive motion language, the Laban Movement Analysis (LMA) method. We extract body features based on LMA factors, which describe both quantitative and qualitative aspects of human movement. For this study, a dataset of 5 gestures performed with 4 emotions is created using the Xsens motion capture system. We use two different approaches for emotion analysis and recognition. The first is based on a machine learning method, the Random Decision Forest (RDF). The second is based on human perception. We derive the most important features for each expressed emotion using the same two methods, the RDF and human ratings. In the discussion section, we compare the results obtained from the automatic learning method against those from human perception.
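As an illustration of the first approach, the sketch below shows how a Random Decision Forest can both classify emotions from LMA-style body features and rank feature importance. This is not the authors' code: the feature names, dataset, and labels are synthetic placeholders standing in for the LMA descriptors and motion-capture data described in the abstract.

```python
# Illustrative sketch only: RDF classification of expressed emotions from
# hypothetical LMA-style features, plus feature-importance ranking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
feature_names = ["body_expansion", "hand_velocity", "acceleration",
                 "trajectory_curvature", "convex_hull_volume"]  # placeholders
emotions = ["happy", "sad", "angry", "neutral"]

# Synthetic stand-in dataset: 200 gesture samples, 5 features, 4 labels.
X = rng.normal(size=(200, len(feature_names)))
y = rng.integers(0, len(emotions), size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Impurity-based importances suggest which descriptors drive each decision.
for name, imp in sorted(zip(feature_names, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

In the paper's setting, the importance ranking per emotion class is what gets compared against the features human raters judged most salient.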
Acknowledgements
We would like to thank the staff of the University of Evry Val d'Essonne for participating in the creation of our dataset. We also thank Mrs. Alice Jourlin for her help with data gathering and tabulation. This work was partially supported by the Strategic Research Initiatives project iCODE, accredited by University Paris Saclay.
Funding
This study was funded by the Strategic Research Initiatives project iCODE, University Paris Saclay.
Ethics declarations
Conflict of interest
Authors Insaf Ajili, Zahra Ramezanpanah, Malik Mallem, and Jean Yves Didier declare that they have no conflict of interest.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Ajili, I., Ramezanpanah, Z., Mallem, M. et al. Expressive motions recognition and analysis with learning and statistical methods. Multimed Tools Appl 78, 16575–16600 (2019). https://doi.org/10.1007/s11042-018-6893-5