Development of FACS-based Android Head for Emotional Expressions

  • Choi, Dongwoon (Applied Robot R&D Department, Korea Institute of Industrial Technology) ;
  • Lee, Duk-Yeon (Applied Robot R&D Department, Korea Institute of Industrial Technology) ;
  • Lee, Dong-Wook (Applied Robot R&D Department, Korea Institute of Industrial Technology)
  • Received : 2020.06.02
  • Accepted : 2020.07.21
  • Published : 2020.07.30

Abstract

This paper proposes the development of an android robot head based on the Facial Action Coding System (FACS) and the generation of emotional expressions using FACS. The term "android robot" refers to a robot with a human-like appearance; such robots have artificial skin and artificial muscles. To express emotions, the number and placement of the artificial muscles had to be determined, which required an anatomical analysis of human facial motion using FACS. In FACS, facial expressions are composed of action units (AUs), and these AUs serve as the basis for determining the number and location of the robot's artificial muscles. The android head developed in this study has servo motors and wires corresponding to 30 artificial muscles, and it is covered with artificial skin to produce facial expressions. To mount many motors in the limited head space, a miniature eyeball module was developed using spherical joints and springs, and the 30 servo motors were arranged based on an efficient wire-routing design. The completed android head has 30 DOFs and can express 13 basic emotions; the recognition rate of these basic emotional expressions was evaluated by spectators at an exhibition.
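The FACS-based pipeline the abstract describes, mapping a named emotion to a set of AUs and each AU to the servo-driven wires that stand in for artificial muscles, can be sketched as follows. This is an illustrative sketch, not the authors' code: the emotion-to-AU combinations follow common FACS conventions from the literature, while the servo channel numbers and pull intensities are hypothetical placeholders.

```python
# Illustrative sketch of a FACS-based expression pipeline (not the authors' code).
# Emotion -> AU combinations follow common FACS conventions; the AU -> servo
# channel assignments and intensities below are hypothetical.

# Well-known emotion-to-AU combinations from the FACS literature.
EMOTION_AUS = {
    "happiness": [6, 12],        # cheek raiser + lip corner puller
    "surprise":  [1, 2, 5, 26],  # brow raisers + upper lid raiser + jaw drop
    "sadness":   [1, 4, 15],     # inner/outer brow action + lip corner depressor
}

# Hypothetical mapping from an AU to the wire-driven servo channels that
# approximate the corresponding artificial muscle: (channel, intensity 0..1).
AU_TO_SERVOS = {
    1:  [(0, 0.8), (1, 0.8)],    # inner brow raiser, left/right wires
    2:  [(2, 0.7), (3, 0.7)],    # outer brow raiser
    4:  [(4, 0.9)],              # brow lowerer
    5:  [(5, 0.6)],              # upper lid raiser
    6:  [(6, 0.5), (7, 0.5)],    # cheek raiser
    12: [(8, 1.0), (9, 1.0)],    # lip corner puller
    15: [(10, 0.7), (11, 0.7)],  # lip corner depressor
    26: [(12, 0.9)],             # jaw drop
}

def servo_commands(emotion):
    """Return {servo_channel: intensity} for a named basic emotion."""
    commands = {}
    for au in EMOTION_AUS[emotion]:
        for channel, intensity in AU_TO_SERVOS[au]:
            # If two AUs drive the same channel, keep the stronger pull.
            commands[channel] = max(commands.get(channel, 0.0), intensity)
    return commands
```

Because each expression is just a set of AU activations, adding a new emotion only requires listing its AUs; the AU-to-servo table, fixed by the mechanical design, is reused unchanged.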

References

  1. C. Breazeal, "Toward sociable robots," Robotics and Autonomous Systems, Vol.42, Issues 3-4, pp.167-175, March, 2003, https://doi.org/10.1016/S0921-8890(02)00373-1.
  2. H. Miwa, K. Itoh, M. Matsumoto, M. Zecca, H. Takanobu, S. Roccella, M.C. Carrozza, P. Dario, and A. Takanishi, "Effective Emotional Expressions with Emotion Expression Humanoid Robot WE-4RII: integration of humanoid robot hand RCH-1," Proceedings of International Conference on Intelligent Robots and Systems, Sendai, Japan, pp.2203-2208, 2004, doi:10.1109/IROS.2004.1389736.
  3. H. Ahn, P. Kim, J. Choi, S.B. Mansoor, W. Kang, S. Yoon, J. Na, Y. Baek, H. Chang, D. Song, J. Choi, and H. Ko, "Emotional Head Robot with Behavior Decision Model and Face Recognition," Proceedings of International Conference on Control, Automation and Systems, Seoul, Korea, pp.2719-2724, 2007, doi:10.1109/ICCAS.2007.4406829.
  4. H. Ahn, D. Lee, D. Choi, D. Lee, H. Lee, and M. Baeg, "Development of an incarnate announcing robot system using emotional interaction with human," International Journal of Humanoid Robotics, Vol.10, No.2, pp.1350017-1-1350017-24, 2013, https://doi.org/10.1142/S0219843613500175.
  5. S. Nishio, H. Ishiguro, and N. Hagita, "Geminoid: Teleoperated Android of an Existing Person," Humanoid Robots: New Developments, INTECH, pp.343-352, 2007.
  6. P. Ekman, E.L. Rosenberg, "What the face reveals: Basic and applied studies of spontaneous expression using the Facial Action Coding System (FACS)", Oxford University Press, 1997.
  7. N. Endo, S. Momoki, M. Zecca, M. Saito, Y. Mizoguchi, K. Itoh, and A. Takanishi, "Development of whole-body emotion expression humanoid robot," Proceedings of International Conference on Robotics and Automation, Pasadena, CA, USA, pp.19-23, 2008, doi:10.1109/ROBOT.2008.4543523.
  8. P. Ekman, W.V. Friesen, M. O'Sullivan, A. Chan, I. Diacoyanni-Tarlatzis, K.G. Heider, R. Krause, W.A. LeCompte, T. Pitcairn, and P.E. Ricci Bitti, "Universals and cultural differences in the judgments of facial expressions of emotion," Journal of Personality and Social Psychology, Vol.53, No.4, pp.712-717, 1987, https://doi.org/10.1037/0022-3514.53.4.712.
  9. B. Fehr, J.A. Russell, "Concept of emotion viewed from a prototype perspective," Journal of Experimental Psychology, Vol.113, pp.464-486, 1984, https://doi.org/10.1037/0096-3445.113.3.464.
  10. C.E. Izard, "Human Emotions," Springer Science+Business Media New York, 1977, doi:10.1007/978-1-4899-2209-0.
  11. S.S. Tomkins, Affect, Imagery, Consciousness, Vol.1, Springer Publishing Company, Inc., 1962.
  12. J. Panksepp, "Toward a general psychobiological theory of emotions," Behavioral and Brain Sciences, Vol.5, pp.407-468, 1982, https://doi.org/10.1017/S0140525X00012863.
  13. L.A. Sroufe, "Socioemotional development," Handbook of infant development, 1979.
  14. H. Ahn, D. Lee, D. Choi, D. Lee, M. Hur, and H. Lee, "Appropriate emotions for facial expressions of 33-DOFs android head EveR-4 H33," Proceedings of International Conference on Robot and Human Interactive Communication, Paris, France, pp.1115-1120, 2012, doi:10.1109/ROMAN.2012.6343898.