DOI: 10.1145/3204493.3204536
Research article · Best Paper

Error-aware gaze-based interfaces for robust mobile gaze interaction

Published: 14 June 2018

ABSTRACT

Gaze estimation error can severely hamper usability and performance of mobile gaze-based interfaces given that the error varies constantly for different interaction positions. In this work, we explore error-aware gaze-based interfaces that estimate and adapt to gaze estimation error on-the-fly. We implement a sample error-aware user interface for gaze-based selection and different error compensation methods: a naïve approach that increases component size directly proportional to the absolute error, a recent model by Feit et al. that is based on the two-dimensional error distribution, and a novel predictive model that shifts gaze by a directional error estimate. We evaluate these models in a 12-participant user study and show that our predictive model significantly outperforms the others in terms of selection rate, particularly for small gaze targets. These results underline both the feasibility and potential of next generation error-aware gaze-based user interfaces.
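
The three compensation strategies lend themselves to a compact illustration. The sketch below is not the authors' implementation: the circular target representation, the hit-test, and all function and parameter names are assumptions made purely for illustration (Python with NumPy).

```python
# Minimal sketch of the three error-compensation strategies described above.
# NOT the authors' implementation: the Target type, the circular hit-test,
# and all names/parameters are illustrative assumptions.
from dataclasses import dataclass

import numpy as np


@dataclass
class Target:
    center: np.ndarray  # (x, y) position on the interaction plane
    radius: float       # nominal selection radius


def naive_compensation(target: Target, abs_error: float) -> Target:
    """Naive approach: enlarge the selectable area in direct proportion
    to the absolute (magnitude-only) gaze estimation error."""
    return Target(target.center, target.radius + abs_error)


def distribution_compensation(target: Target, error_cov: np.ndarray,
                              n_std: float = 2.0) -> Target:
    """Distribution-based idea (in the spirit of Feit et al., simplified):
    size the target from the two-dimensional error distribution so that
    most sampled gaze offsets still fall inside it."""
    sigma_max = float(np.sqrt(np.max(np.linalg.eigvalsh(error_cov))))
    return Target(target.center, target.radius + n_std * sigma_max)


def predictive_compensation(gaze: np.ndarray,
                            predicted_offset: np.ndarray) -> np.ndarray:
    """Predictive model: shift the measured gaze point by a directional
    error estimate before hit-testing, instead of enlarging the target."""
    return gaze - predicted_offset


def is_selected(gaze: np.ndarray, target: Target) -> bool:
    """Circular hit-test shared by all three strategies in this sketch."""
    return float(np.linalg.norm(gaze - target.center)) <= target.radius
```

In this reading, the first two strategies adapt the interface component, whereas the predictive model corrects the gaze signal itself, which is why it can still help when targets are too small to be enlarged.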

References

1. Stanislavs Bardins, Tony Poitschke, and Stefan Kohlbecher. 2008. Gaze-based Interaction in Various Environments. In Proceedings of the 1st ACM Workshop on Vision Networks for Behavior Analysis (VNBA '08). ACM, New York, NY, USA, 47--54.
2. Michael Barz, Andreas Bulling, and Florian Daiber. 2016. Prediction of Gaze Estimation Error for Error-Aware Gaze-Based Interfaces. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '16). ACM, New York, NY, USA.
3. Michael Barz, Peter Poller, and Daniel Sonntag. 2017. Evaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRI. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Bilge Mutlu, Manfred Tscheligi, Astrid Weiss, and James E. Young (Eds.). ACM, New York, NY, USA, 79--80.
4. Michael Barz and Daniel Sonntag. 2016. Gaze-guided object classification using deep neural networks for attention-based computing. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct (UbiComp '16 Adjunct). ACM, New York, NY, USA, 253--256.
5. Pieter Blignaut, Kenneth Holmqvist, Marcus Nyström, and Richard Dewhurst. 2014. Improving the Accuracy of Video-Based Eye-Tracking in Real-Time through Post-Calibration Regression. Springer, 77--100.
6. Pieter Blignaut and Daniël Wium. 2013. The Effect of Mapping Function on the Accuracy of a Video-based Eye Tracker. In Proceedings of the 2013 Conference on Eye Tracking South Africa (ETSA '13). ACM, New York, NY, USA, 39--46.
7. Jurek Breuninger, Christian Lange, and Klaus Bengler. 2011. Implementing Gaze Control for Peripheral Devices. In Proceedings of the 1st International Workshop on Pervasive Eye Tracking & Mobile Eye-based Interaction (PETMEI '11). ACM, New York, NY, USA, 3--8.
8. Andreas Bulling, Daniel Roggen, and Gerhard Tröster. 2008. EyeMote --- Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography. In Proceedings of the 2nd International Conference on Fun and Games. Springer-Verlag, Berlin, Heidelberg, 33--45.
9. Géry Casiez, Nicolas Roussel, and Daniel Vogel. 2012. 1 Euro Filter: A Simple Speed-based Low-pass Filter for Noisy Input in Interactive Systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '12). ACM, New York, NY, USA, 2527--2530.
10. Juan J. Cerrolaza, Arantxa Villanueva, Maria Villanueva, and Rafael Cabeza. 2012. Error Characterization and Compensation in Eye Tracking Systems. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 205--208.
11. Jan Drewes, Guillaume S. Masson, and Anna Montagnini. 2012. Shifts in Reported Gaze Position Due to Changes in Pupil Size: Ground Truth and Compensation. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 209--212.
12. Augusto Esteves, Eduardo Velloso, Andreas Bulling, and Hans Gellersen. 2015. Orbits. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 457--466.
13. Anna Maria Feit, Shane Williams, Arturo Toledo, Ann Paradiso, Harish Kulkarni, Shaun Kane, and Meredith Ringel Morris. 2017. Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA, 1118--1130.
14. Jeremy Hales, David Rozado, and Diako Mardanbegi. 2011. Interacting with Objects in the Environment by Gaze and Hand Gestures. In Proceedings of the 3rd International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction. 1--9.
15. Robert J. K. Jacob. 1991. The Use of Eye Movements in Human-computer Interaction Techniques: What You Look at is What You Get. ACM Trans. Inf. Syst. 9, 2 (April 1991), 152--169.
16. Samuel John, Erik Weitnauer, and Hendrik Koesling. 2012. Entropy-based Correction of Eye Tracking Data for Static Scenes. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 297--300.
17. Moritz Kassner, William Patera, and Andreas Bulling. 2014. Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (UbiComp '14 Adjunct). ACM, New York, NY, USA, 1151--1160.
18. Christian Lander, Sven Gehring, Antonio Krüger, Sebastian Boring, and Andreas Bulling. 2015a. GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 395--404.
19. Christian Lander, Marco Speicher, Denise Paradowski, Norine Coenen, Sebastian Biewer, and Antonio Krüger. 2015b. Collaborative Newspaper: Exploring an Adaptive Scrolling Algorithm in a Multi-user Reading Scenario. In Proceedings of the 4th International Symposium on Pervasive Displays (PerDis '15). ACM, New York, NY, USA, 163--169.
20. Diako Mardanbegi and Dan Witzner Hansen. 2011. Mobile Gaze-based Screen Interaction in 3D Environments. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 2, 4 pages.
21. Diako Mardanbegi and Dan Witzner Hansen. 2012. Parallax Error in the Monocular Head-mounted Eye Trackers. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp '12). ACM, New York, NY, USA, 689--694.
22. Darius Miniotas, Oleg Špakov, and I. Scott MacKenzie. 2004. Eye Gaze Interaction with Expanding Targets. In CHI '04 Extended Abstracts on Human Factors in Computing Systems (CHI EA '04). ACM, New York, NY, USA, 1255--1258.
23. A. Monden, K. Matsumoto, and M. Yamato. 2005. Evaluation of Gaze-Added Target Selection Methods Suitable for General GUIs. Int. J. Comput. Appl. Technol. 24, 1 (June 2005), 17--24.
24. Marcus Nyström, Richard Andersson, Kenneth Holmqvist, and Joost van de Weijer. 2013. The influence of calibration method and eye physiology on eyetracking data quality. Behavior Research Methods 45, 1 (2013), 272--288.
25. F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. 2011. Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research 12 (2011), 2825--2830.
26. Jeffrey S. Shell, Roel Vertegaal, and Alexander W. Skaburskis. 2003. EyePliances: Attention-seeking Devices That Respond to Visual Attention. In CHI '03 Extended Abstracts on Human Factors in Computing Systems (CHI EA '03). ACM, New York, NY, USA, 770--771.
27. Daniel Sonntag. 2015. Kognit: Intelligent Cognitive Enhancement Technology by Cognitive Models and Mixed Reality for Dementia Patients. (2015). https://www.aaai.org/ocs/index.php/FSS/FSS15/paper/view/11702
28. Sophie Stellmach and Raimund Dachselt. 2012. Look & Touch: Gaze-supported Target Acquisition. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '12). ACM, New York, NY, USA, 2981--2990.
29. Sophie Stellmach, Sebastian Stober, Andreas Nürnberger, and Raimund Dachselt. 2011. Designing Gaze-supported Multimodal Interactions for the Exploration of Large Image Collections. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 1, 8 pages.
30. Takumi Toyama, Thomas Kieninger, Faisal Shafait, and Andreas Dengel. 2012. Gaze Guided Object Recognition Using a Head-mounted Eye Tracker. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 91--98.
31. Takumi Toyama and Daniel Sonntag. 2015. Towards episodic memory support for dementia patients by recognizing objects, faces and text in eye gaze. KI 2015: Advances in Artificial Intelligence 9324 (2015), 316--323.
32. Jayson Turner, Andreas Bulling, Jason Alexander, and Hans Gellersen. 2014. Cross-device Gaze-supported Point-to-point Content Transfer. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14). ACM, New York, NY, USA, 19--26.
33. Mélodie Vidal, Andreas Bulling, and Hans Gellersen. 2013. Pursuits: Spontaneous Interaction with Displays Based on Smooth Pursuit Eye Movement and Moving Targets. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '13). ACM, New York, NY, USA, 439--448.
34. Oleg Špakov. 2011. Comparison of Gaze-to-objects Mapping Algorithms. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 6, 8 pages.
35. Oleg Špakov. 2012. Comparison of Eye Movement Filters Used in HCI. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 281--284.
36. Oleg Špakov and Yulia Gizatdinova. 2014. Real-time Hidden Gaze Point Correction. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14). ACM, New York, NY, USA, 291--294.
37. Lawrence H. Yu and E. Eizenman. 2004. A new methodology for determining point-of-gaze in head-mounted eye tracking systems. IEEE Transactions on Biomedical Engineering 51, 10 (Oct 2004), 1765--1773.
38. Xinyong Zhang, Xiangshi Ren, and Hongbin Zha. 2008. Improving Eye Cursor's Stability for Eye Pointing Tasks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '08). ACM, New York, NY, USA, 525--534.
39. Yanxia Zhang, Andreas Bulling, and Hans Gellersen. 2013. SideWays: A Gaze Interface for Spontaneous Interaction with Situated Displays. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, 851--860.
40. Yunfeng Zhang and Anthony J. Hornof. 2011. Mode-of-disparities error correction of eye-tracking data. Behavior Research Methods 43, 3 (September 2011), 834--842.
41. Yunfeng Zhang and Anthony J. Hornof. 2014. Easy Post-hoc Spatial Recalibration of Eye Tracking Data. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14). ACM, New York, NY, USA, 95--98.
42. Yanxia Zhang, Jörg Müller, Ming Ki Chong, Andreas Bulling, and Hans Gellersen. 2014. GazeHorizon: Enabling Passers-by to Interact with Public Displays by Gaze. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '14). ACM, New York, NY, USA, 559--563.

Published in

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018, 595 pages
ISBN: 9781450357067
DOI: 10.1145/3204493
Copyright © 2018 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 14 June 2018

        Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%
