ABSTRACT
This paper proposes a gaze-communicative stuffed-toy robot system with joint-attention and eye-contact reactions based on ambient gaze tracking. To support free and natural interaction, we adopted our remote gaze-tracking method. In response to the user's gaze, the gaze-reactive stuffed-toy robot is designed to gradually establish 1) joint attention using the direction of the robot's head and 2) eye-contact reactions drawn from several sets of motions. From both subjective evaluations and observations of the user's gaze in the demonstration experiments, we found that i) joint attention draws the user's interest in step with the interest the user attributes to the robot, ii) "eye contact" gives the user a favorable feeling toward the robot, and iii) this feeling is enhanced when "eye contact" is used in combination with "joint attention." These results support our embodied gaze-communication model.
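The interaction model described above — reacting to the user's estimated gaze by either returning it (eye contact) or turning the head toward the gazed-at object (joint attention) — can be sketched as a simple behavior selector. The target categories, duration threshold, and behavior names below are illustrative assumptions, not the paper's implementation:

```python
from enum import Enum, auto

class GazeTarget(Enum):
    """Coarse classification of where the user is looking (assumed categories)."""
    ROBOT = auto()      # user looks at the robot
    OBJECT = auto()     # user looks at a shared object
    ELSEWHERE = auto()  # gaze falls outside both

def select_behavior(gaze_target, mutual_gaze_duration, threshold=1.0):
    """Pick a reaction from the estimated user gaze.

    Hypothetical logic: the threshold (seconds of sustained mutual gaze
    before triggering an eye-contact reaction) is an illustrative value.
    """
    if gaze_target is GazeTarget.ROBOT:
        if mutual_gaze_duration >= threshold:
            return "eye_contact_reaction"  # e.g., nodding or blinking motion
        return "gaze_back"                 # return the user's gaze first
    if gaze_target is GazeTarget.OBJECT:
        return "joint_attention"           # turn head toward the object
    return "idle"

# Example: sustained mutual gaze triggers the eye-contact reaction.
print(select_behavior(GazeTarget.ROBOT, 1.5))   # eye_contact_reaction
print(select_behavior(GazeTarget.OBJECT, 0.0))  # joint_attention
```

The "gradual" establishment in the paper is reflected here only as a duration threshold; a fuller model would sequence gaze-back, joint-attention, and eye-contact motions over time.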