ABSTRACT
This paper proposes a daily-partner robot that is aware of the user's situation and behavior through gaze and utterance detection. For appropriate and familiar anthropomorphic interaction, the robot should wait for a suitable moment to speak to the user, in accordance with her/his situation while she/he is performing a task or thinking. Accordingly, the proposed robot i) estimates the user's context by detecting her/his gaze and utterances, including the target of the user's speech, ii) signals its need to speak through silent (i.e., without making an utterance) gaze turns toward the user and joint attention, taking advantage of their attention-drawing effect, and iii) delivers its message when the user talks to the robot. The results of experiments in which subjects performed daily tasks with and without the above steps show that the robot's crossmodal-aware behaviors are important for respectful communication: the silent behaviors convey the robot's intention to speak and draw the user's attention without disturbing the ongoing task.
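To make the three-step interaction flow concrete, the following is a minimal sketch of the control logic as a simple state machine. The state names, the sensing inputs (a coarse gaze target and the addressee of detected speech), and the actuator stubs are assumptions introduced for illustration; they are not taken from the paper's implementation.

```python
from enum import Enum, auto

class State(Enum):
    OBSERVE = auto()        # i) estimate the user's context from gaze and utterance
    SILENT_APPEAL = auto()  # ii) silent gaze turn / joint attention toward the user
    SPEAK = auto()          # iii) deliver the message once the user addresses the robot

class PartnerRobot:
    """Hypothetical controller following the three steps described in the abstract."""

    def __init__(self):
        self.state = State.OBSERVE
        self.pending_message = None

    def step(self, gaze_target, speech_addressee):
        """gaze_target / speech_addressee come from gaze and utterance detection,
        e.g. 'task', 'robot', or None; both labels are illustrative assumptions."""
        if self.state is State.OBSERVE:
            # Signal only when there is something to say and the user is not
            # visibly absorbed in the task (a stand-in for context estimation).
            if self.pending_message and gaze_target != "task":
                self.state = State.SILENT_APPEAL
                self.turn_gaze_to_user()          # silent gaze turn, no utterance
        elif self.state is State.SILENT_APPEAL:
            # Per the abstract, the message is told when the user talks to the robot.
            if speech_addressee == "robot":
                self.state = State.SPEAK
                self.say(self.pending_message)
                self.pending_message = None
                self.state = State.OBSERVE
        return self.state

    # Actuator stubs (assumed interfaces, not from the paper).
    def turn_gaze_to_user(self):
        print("[robot] silently turns its gaze toward the user")

    def say(self, message):
        print(f"[robot] says: {message}")
```

A usage example under the same assumptions: `PartnerRobot().step(gaze_target=None, speech_addressee=None)` would keep the robot observing, while a pending message plus a later `speech_addressee == "robot"` would trigger the silent appeal and then the spoken message.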