Abstract
We developed and studied an experimental system, RealTourist, which lets a user plan a conference trip with the help of a remote tourist consultant who can view the tourist’s eye-gaze superimposed onto a shared map. Data collected from the experiment were analyzed in conjunction with a literature review on speech and eye-gaze patterns. This inspective, exploratory research identified several functions of gaze overlay on shared spatial material: accurate and direct display of the partner’s eye-gaze, implicit deictic referencing, interest detection, common focus and topic switching, increased redundancy and reduced ambiguity, and greater assurance, confidence, and understanding. The study serves two purposes. The first is to identify patterns that can serve as a basis for designing multimodal human-computer dialogue systems with eye-gaze locus as a contributing channel. The second is to investigate how computer-mediated communication can be supported by displaying the partner’s eye-gaze.
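To make the core mechanism concrete, the sketch below illustrates the general idea of a gaze overlay as described above: raw gaze samples from one partner's eye tracker are smoothed to damp jitter, mapped into the other partner's view of the shared map, and rendered as a marker. This is a minimal illustrative sketch only, not the RealTourist implementation; all names (`GazeOverlay`, `to_partner_coords`, the window size) are hypothetical assumptions.

```python
# Illustrative sketch of a gaze overlay on a shared map.
# NOT the RealTourist implementation; names and parameters are hypothetical.
from collections import deque


class GazeOverlay:
    """Smooths raw gaze samples with a short moving average before display."""

    def __init__(self, window=5):
        # Keep only the most recent `window` samples; a moving average
        # damps eye-tracker jitter so the displayed marker is stable.
        self.samples = deque(maxlen=window)

    def add_sample(self, x, y):
        self.samples.append((x, y))

    def smoothed(self):
        # Moving average of recent samples; None until data arrives.
        if not self.samples:
            return None
        n = len(self.samples)
        sx = sum(p[0] for p in self.samples) / n
        sy = sum(p[1] for p in self.samples) / n
        return (sx, sy)


def to_partner_coords(point, scale=1.0, dx=0.0, dy=0.0):
    # Map a point from one partner's map viewport into the other's.
    # A simple affine transform stands in for whatever pan/zoom
    # mapping a real shared-map system would apply.
    x, y = point
    return (x * scale + dx, y * scale + dy)


overlay = GazeOverlay(window=3)
for x, y in [(100, 200), (104, 198), (102, 202)]:
    overlay.add_sample(x, y)
marker = to_partner_coords(overlay.smoothed())  # marker position on partner's map
print(marker)
```

In a real system the marker position would be redrawn on every tracker frame; the smoothing window trades responsiveness against visual stability of the displayed gaze point.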
Keywords
- Joint Attention
- Task Completion Time
- Computer-Supported Cooperative Work
- Computer-Mediated Communication
- Conversational Partner
Copyright information
© 2005 IFIP International Federation for Information Processing
Cite this paper
Qvarfordt, P., Beymer, D., Zhai, S. (2005). RealTourist – A Study of Augmenting Human-Human and Human-Computer Dialogue with Eye-Gaze Overlay. In: Costabile, M.F., Paternò, F. (eds) Human-Computer Interaction - INTERACT 2005. INTERACT 2005. Lecture Notes in Computer Science, vol 3585. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11555261_61
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28943-2
Online ISBN: 978-3-540-31722-7