
Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation

  • Long Paper
  • Published in: Universal Access in the Information Society

Abstract

Attentive user interfaces (AUIs) capitalize on the rich information that can be obtained from users’ gaze behavior in order to infer relevant aspects of their cognitive state. Eye gaze is an excellent cue not only to states of interest and intention, but also to preference and to confidence in comprehension. AUIs aim to adapt the interface to the user’s current information need, and thus to reduce the workload of interaction. Given these characteristics, AUIs may offer particular benefits to users with severe disabilities, for whom operating a physical input device (such as a mouse) can be very strenuous or infeasible. This paper presents three studies that attempt to gauge uncertainty and intention on the part of the user from gaze data, and compares the success of each approach. The paper discusses how applying the approaches adopted in each study to user interfaces can support users with severe disabilities.
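The kind of gaze-based interaction the abstract alludes to can be illustrated with a minimal sketch of dwell-time selection, a common building block of attentive interfaces in which sustained gaze on a target is read as intention to select it. This is not the method of any of the three studies; the names `Target`, `dwell_select`, and the 0.5 s threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """An on-screen region that can be selected by gaze (illustrative)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dwell_select(samples, targets, dwell_threshold_s=0.5):
    """Return the name of the first target whose accumulated gaze dwell
    reaches the threshold; gaze leaving a target resets its accumulator.

    samples: iterable of (time_s, x, y) gaze points in screen coordinates.
    """
    dwell = {t.name: 0.0 for t in targets}
    prev_time = None
    for time_s, x, y in samples:
        dt = 0.0 if prev_time is None else time_s - prev_time
        prev_time = time_s
        for t in targets:
            if t.contains(x, y):
                dwell[t.name] += dt
                if dwell[t.name] >= dwell_threshold_s:
                    return t.name
            else:
                dwell[t.name] = 0.0  # gaze left the target; restart dwell
    return None
```

A real attentive interface would add fixation filtering and feedback (e.g., a shrinking ring) before committing a selection, since raw dwell alone is prone to the "Midas touch" problem of unintended selections.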




Notes

  1. http://research.nii.ac.jp/~prendinger/GALA2006/.

  2. The ProComp Infinity™ encoder from [50] was used to process the bio-signals skin conductance and blood volume pulse. Bio-signal processing is mentioned for the sake of completeness, but will not be further discussed here (see [51] for a detailed discussion).

References

  1. Bratman, M.E.: Intentions, Plans, and Practical Reason. Harvard University Press, Cambridge (1987)

  2. Bratman, M.E.: What is intention? In: Cohen, P.R., Morgan, J.L., Pollack, M.E. (eds.) Intentions in Communication, pp. 15–32. The MIT Press, Cambridge (1990)

  3. Cohen, P.R., Levesque, H.J.: Intention is choice with commitment. Artif. Intell. 42(2–3), 213–261 (1990)

  4. Shell, J.S., Selker, T., Vertegaal, R.: Interacting with groups of computers. Commun. ACM 46(3), 40–46 (2003)

  5. Zhai, S.: What’s in the eyes for attentive input. Commun. ACM 46(3), 34–39 (2003)

  6. Horvitz, E., Kadie, C.M., Paek, T., Hovel, D.: Models of attention in computing and communications: from principles to applications. Commun. ACM 46(3), 52–59 (2003)

  7. Jeffrey, R.C.: The Logic of Decision, 2nd edn. The University of Chicago Press, Chicago (1983)

  8. Majaranta, P., Räihä, K.-J.: Twenty years of eye typing: systems and design issues. In: Proceedings of the symposium on eye tracking research and applications (ETRA-02), pp. 15–22. ACM Press (2002)

  9. Jacob, R.J.K.: The use of eye movements in human–computer interaction techniques: what you look at is what you get. ACM Trans. Inf. Syst. 9(3), 152–169 (1991)

  10. Vertegaal, R.: Designing attentive interfaces. In: Proceedings of the symposium on eye tracking research and applications (ETRA-02), pp. 22–30. ACM Press (2002)

  11. Hyrskykari, A., Majaranta, P., Räihä, K.-J.: From gaze control to attentive interfaces. In: Proceedings of HCII 2005. Erlbaum (2005)

  12. Starker, I., Bolt, R.A.: A gaze-responsive self-disclosing display. In: Proceedings of the ACM CHI 1990 conference on human factors in computing systems, pp. 3–9. ACM Press (1990)

  13. Qvarfordt, P., Zhai, S.: Conversing with the user based on eye-gaze patterns. In: Proceedings of the ACM CHI 2005 conference on human factors in computing systems, pp. 221–230. ACM Press (2005)

  14. Hansen, J.P., Andersen, A.W., Roed, P.: Eye-gaze control of multimedia systems. In: Proceedings of the 6th international conference on human–computer interaction (HCI-95), pp. 151–190. Elsevier (1995)

  15. Andreassi, J.L.: Psychophysiology: Human Behavior and Physiological Response, 4th edn. Lawrence Erlbaum Associates, Mahwah (2000)

  16. Oyekoya, O.K., Stentiford, F.W.M.: Eye tracking as a new interface for image retrieval. BT Technol. J. 22(3), 161–169 (2004)

  17. Selker, T.: Visual attentive interfaces. BT Technol. J. 22(4), 146–150 (2004)

  18. Bee, N., Prendinger, H., Nakasone, A., André, E., Ishizuka, M.: Automatic preference detection by analyzing the gaze ‘cascade effect’. In: Electronic proceedings of the 2nd conference on communication by gaze interaction (COGAIN 2006): gazing into the future, pp. 61–64 (2006)

  19. Johnson, P., Johnson, H., Waddington, R., Shouls, A.: Task-related knowledge structures: analysis, modelling and application. In: Proceedings of the fourth conference of the British Computer Society on people and computers IV, pp. 35–62. University of Manchester, UK (1988)

  20. Card, S.K., Moran, T.P., Newell, A.: The Psychology of Human–Computer Interaction. Lawrence Erlbaum Associates, New Jersey (1983)

  21. Rudmann, D.S., McConkie, G.W., Zheng, X.S.: Speech and gaze: eyetracking in cognitive state detection for HCI. In: Proceedings of the 5th international conference on multimodal interfaces (ICMI-03), pp. 159–163. ACM Press (2003)

  22. Edwards, G.: A tool for creating eye-aware applications that adapt to changes in user behaviors. In: Proceedings of the third international ACM conference on assistive technologies (Assets-98), pp. 67–74. ACM Press (1998)

  23. Goldberg, J.H., Schryver, J.C.: Eye-gaze determination of user intent at the computer interface. In: Findlay, J.M., Walker, R., Kentridge, R. (eds.) Eye Movement Research: Mechanisms, Processes, and Applications. Elsevier, Amsterdam (1995)

  24. Smith, J.D., Vertegaal, R., Sohn, C.: ViewPointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis. In: Proceedings of the 18th annual ACM symposium on user interface software and technology (UIST 2005), pp. 53–61. ACM Press (2005)

  25. Nijholt, A., Heylen, D., Vertegaal, R.: Inhabited interfaces: attentive conversational agents that help. In: Proceedings of the 3rd international conference on disability, virtual reality and associated technologies (ICDVRAT 2000), ISBN 0-7049-11-42-6, pp. 225–230, Alghero, Sardinia, Italy (2000)

  26. Prendinger, H., Ma, C., Yingzi, J., Nakasone, A., Ishizuka, M.: Understanding the effect of life-like interface agents through users’ eye movements. In: Proceedings of the seventh international conference on multimodal interfaces (ICMI-05), pp. 108–115. ACM Press (2005)

  27. Eichner, T., Prendinger, H., André, E., Ishizuka, M.: Attentive presentation agents. In: Proceedings of the 7th international conference on intelligent virtual agents (IVA-07), pp. 283–295. Springer LNCS 4722 (2007)

  28. Kendon, A.: Some functions of gaze-direction in social interaction. Acta Psychol. 26, 22–63 (1967)

  29. Velichkovsky, B.M., Hansen, J.P.: New technological windows into mind: there is more in eyes and brains for human–computer interaction. In: Proceedings of the ACM CHI 1996 conference on human factors in computing systems, pp. 496–503. ACM Press (1996)

  30. Vertegaal, R., Slagter, R., van der Veer, G., Nijholt, A.: Eye gaze patterns in conversations: there is more to conversational agents than meets the eyes. In: Proceedings of the ACM CHI 2001 conference on human factors in computing systems, pp. 301–308. ACM Press (2001)

  31. Hyrskykari, A.: Utilizing eye movements: overcoming inaccuracy while tracking the focus of attention during reading. Comput. Hum. Behav. 22(4), 103–117 (2006)

  32. Hyrskykari, A.: Eyes in attentive interfaces: experiences from creating iDict, a gaze-aware reading aid. Ph.D. thesis, University of Tampere (2006). http://acta.uta.fi/pdf/951-44-6643-8.pdf

  33. Brooke, J.: SUS: a quick and dirty usability scale. In: Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, I.L. (eds.) Usability Evaluation in Industry, pp. 189–194. Taylor & Francis, London (1996)

  34. Underwood, G.: Eye fixations on pictures of natural scenes: getting the gist and identifying the components. In: Underwood, G. (ed.) Cognitive Processes in Eye Guidance, pp. 163–187. Oxford University Press, Oxford (2005)

  35. Nakayama, M., Takahasi, M.: An estimation of certainty for multiple choice responses using eye-movement. In: Electronic proceedings of the 2nd conference on communication by gaze interaction (COGAIN 2006): gazing into the future, pp. 67–72 (2006)

  36. Puolamäki, K., Salojärvi, J., Savia, E., Simola, J., Kaski, S.: Combining eye movements and collaborative filtering for proactive information retrieval. In: Proceedings of ACM SIGIR 2005, pp. 145–153 (2005)

  37. Salojärvi, J., Puolamäki, K., Kaski, S.: Relevance feedback from eye movements for proactive information retrieval. In: Workshop on processing sensory information for proactive systems (2004)

  38. Nac Image Technology, Inc.: Eyemark data analysis system, model SP-505 (2003)

  39. Japan Society of Vision Science (ed.): Shikaku Jyouhou Syori Handobukku (Handbook of Visual Information Processing). Asakura Shoten, Tokyo (2001)

  40. Choi, Y.S., Mosley, A.D., Stark, L.W.: String editing analysis of human visual search. Optom. Vis. Sci. 72(7), 439–451 (1995)

  41. Nac Image Technology, Inc.: EMR-8NL. http://www.eyemark.jp/ (2001)

  42. Shinotsuka, H.: Confidence and accuracy of mental judgments. Jpn. J. Psychol. 63(6), 396–403 (1993)

  43. Hess, E.H.: Pupillometrics: a method of studying mental, emotional and sensory processes. In: Greenfield, N., Sternbach, R. (eds.) Handbook of Psychophysiology, pp. 491–531. Holt, Rinehart & Winston, New York (1972)

  44. Krugman, H.: Some applications of pupil measurement. J. Mark. Res. 1, 15–19 (1964)

  45. Shimojo, S., Simion, C., Shimojo, E., Scheier, C.: Gaze bias both reflects and influences preference. Nat. Neurosci. 6(12), 1317–1322 (2003)

  46. Seeing Machines: http://www.seeingmachines.com/ (2005)

  47. Schultheis, H., Jameson, A.: Assessing cognitive load in adaptive hypermedia systems: physiological and behavioral methods. In: Proceedings of adaptive hypermedia and adaptive web-based systems (AH-04), pp. 225–234. Springer LNCS 3137, Berlin (2004)

  48. Simion, C.: Orienting and preference: an enquiry into the mechanisms underlying emotional decision making. Ph.D. thesis, California Institute of Technology (2005)

  49. Istance, H., Hyrskykari, A., Koskinen, D., Bates, R.: Gaze-based attentive user interfaces (AUIs) to support disabled users: towards a research agenda. In: Electronic proceedings of the 2nd conference on communication by gaze interaction (COGAIN 2006): gazing into the future, pp. 56–62 (2006)

  50. Thought Technology Ltd.: http://www.thoughttechnology.com (2005)

  51. Bee, N., Prendinger, H., Nakasone, A., André, E., Ishizuka, M.: AutoSelect: what you want is what you get. Real-time processing of visual attention and affect. In: Tutorial and research workshop on perception and interactive technologies (PIT-06), pp. 40–52. Springer LNCS 4021 (2006)


Author information

Corresponding author

Correspondence to Aulikki Hyrskykari.


Cite this article

Prendinger, H., Hyrskykari, A., Nakayama, M. et al.: Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation. Univ. Access Inf. Soc. 8, 339–354 (2009). https://doi.org/10.1007/s10209-009-0144-5
