ABSTRACT
In this paper, we discuss the use of eye-gaze tracking technology on mobile phones. In particular, we investigate how gaze interaction can be used to control applications on handheld devices. In contrast to eye-tracking systems for desktop computers, mobile devices pose several challenges, such as varying light intensity outdoors and calibration issues. We therefore compared two approaches for controlling mobile phones with the eyes: standard eye-gaze interaction based on the dwell-time method, and gaze gestures. Gaze gestures are a new concept that we believe has the potential to overcome many of these problems. We conducted a user study to determine whether people are able to interact with applications using these approaches. The results confirm that eye-gaze interaction for mobile phones is attractive to users and that gaze gestures are a viable alternative method for eye-gaze based interaction.
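The two compared techniques can be illustrated in code. The following is a minimal sketch, not the authors' implementation: dwell-time selection activates a target once the gaze rests on it for a fixed threshold, while gaze gestures quantize gaze movements into stroke directions and match the resulting sequence against a vocabulary. The `DwellSelector` class, the 1.0 s threshold, and the example gesture vocabulary are all assumptions for illustration.

```python
class DwellSelector:
    """Dwell-time selection sketch: fire a target after sustained gaze.

    Hypothetical illustration; the threshold (1.0 s) is an assumed value,
    not one reported in the paper.
    """

    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time
        self.current_target = None
        self.enter_time = None
        self.fired = False

    def update(self, target, timestamp):
        """Feed the target currently under the gaze point.

        Returns the target once it has been dwelled on; None otherwise.
        """
        if target != self.current_target:
            # Gaze moved to a new target (or off all targets): restart timer.
            self.current_target = target
            self.enter_time = timestamp
            self.fired = False
            return None
        if (target is not None and not self.fired
                and timestamp - self.enter_time >= self.dwell_time):
            self.fired = True  # fire once per continuous fixation
            return target
        return None


# Gaze-gesture sketch: strokes are quantized gaze movements (U/D/L/R);
# the gesture vocabulary below is an assumed example mapping.
GESTURES = {"RLRL": "open", "DU": "back"}

def strokes_to_command(strokes, gestures=GESTURES):
    """Match a stroke sequence against the gesture vocabulary."""
    return gestures.get("".join(strokes))
```

A usage example: feeding `("button", t)` samples into `DwellSelector.update` returns `"button"` once the fixation has lasted the full dwell time, and `strokes_to_command(["R", "L", "R", "L"])` yields the command bound to that stroke pattern. The one-shot `fired` flag mirrors a common design choice in dwell interfaces, preventing repeated activations during a single fixation (the "Midas touch" problem).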
Index Terms
- Eye-gaze interaction for mobile phones