Short paper · DOI: 10.1145/3379156.3391835

Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality

Published: 02 June 2020

ABSTRACT

Gaze tracking in 3D has the potential to improve interaction with objects and visualizations in augmented reality. However, previous research showed that the subjective perception of distance varies between real and virtual surroundings. We wanted to determine whether 3D gaze depth, objectively measured through eye tracking, also exhibits differences between entirely real and augmented environments. To this end, we conducted an experiment (N = 25) using a Microsoft HoloLens with a binocular eye tracking add-on from Pupil Labs. Participants performed a task that required them to look at stationary real and virtual objects while wearing the HoloLens. We were not able to find significant differences between the two conditions in the gaze depth measured by eye tracking. Finally, we discuss our findings, their implications for gaze interaction in immersive analytics, and the quality of the collected gaze data.
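
As context for the headline measurement: with a binocular tracker such as the Pupil Labs add-on used here, gaze depth is typically estimated from vergence, i.e., by intersecting the two eyes' gaze rays and taking their closest point of approach. The sketch below illustrates this standard vergence-based estimate; it is not the authors' implementation, and the function name, coordinate conventions (viewer looking along +z, units in meters), and example values are illustrative assumptions.

```python
import numpy as np

def gaze_depth_from_vergence(o_l, d_l, o_r, d_r, eps=1e-9):
    """Estimate the 3D gaze point as the midpoint of the shortest
    segment between the left and right gaze rays, and return it
    together with its depth (distance along the +z viewing axis).

    o_l, o_r: ray origins (eyeball centers), shape (3,)
    d_l, d_r: ray directions, shape (3,); need not be unit length
    """
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b  # tends to 0 as rays become parallel (gaze at infinity)
    if abs(denom) < eps:
        return None, np.inf
    s = (b * e - c * d) / denom  # parameter of closest point on the left ray
    t = (a * e - b * d) / denom  # parameter of closest point on the right ray
    midpoint = 0.5 * ((o_l + s * d_l) + (o_r + t * d_r))
    return midpoint, midpoint[2]

# Example: eyes 64 mm apart, both fixating a point 1 m straight ahead.
ipd = 0.064
target = np.array([0.0, 0.0, 1.0])
o_l = np.array([-ipd / 2, 0.0, 0.0])
o_r = np.array([+ipd / 2, 0.0, 0.0])
point, depth = gaze_depth_from_vergence(o_l, target - o_l, o_r, target - o_r)
print(point, depth)  # ~[0, 0, 1], depth ~1.0 m
```

Taking the midpoint of the shortest inter-ray segment tolerates the small calibration errors that keep the two rays from intersecting exactly; because the rays approach parallel as fixation distance grows, vergence-based depth estimates degrade rapidly at far distances, which matters when judging the quality of the collected gaze data.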

Published in

ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020, 305 pages
ISBN: 9781450371346
DOI: 10.1145/3379156

Copyright © 2020 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Qualifiers

• short-paper
• Research
• Refereed limited

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%
