Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality

ABSTRACT
Gaze tracking in 3D has the potential to improve interaction with objects and visualizations in augmented reality. However, previous research has shown that the subjective perception of distance differs between real and virtual surroundings. We wanted to determine whether objectively measured 3D gaze depth from eye tracking also differs between entirely real and augmented environments. To this end, we conducted an experiment (N = 25) using a Microsoft HoloLens equipped with a binocular eye tracking add-on from Pupil Labs. Participants performed a task that required them to fixate stationary real and virtual objects while wearing the device. We found no significant differences between the two conditions in the gaze depth measured by eye tracking. Finally, we discuss our findings, their implications for gaze interaction in immersive analytics, and the quality of the collected gaze data.
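The abstract does not spell out how 3D gaze depth is obtained from the binocular tracker; a common approach is vergence-based triangulation, where each eye's gaze is modeled as a ray and the fixation point is taken near the rays' closest approach. The sketch below is a minimal Python/NumPy illustration of that idea, not the authors' pipeline; the function name, coordinate conventions, and the parallel-ray fallback are all assumptions made for the example.

```python
import numpy as np

def vergence_gaze_depth(origin_l, dir_l, origin_r, dir_r, eps=1e-6):
    """Estimate gaze depth by triangulating two gaze rays.

    Computes the midpoint of the shortest segment between the left and
    right eye's gaze rays and returns its distance from the cyclopean
    eye (the midpoint between the two eye origins). All inputs are
    3-vectors in a common, e.g. head-fixed, coordinate system.
    """
    d_l = np.asarray(dir_l, float)
    d_r = np.asarray(dir_r, float)
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    origin_l = np.asarray(origin_l, float)
    origin_r = np.asarray(origin_r, float)

    w0 = origin_l - origin_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r  # a == c == 1 after normalization
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if denom < eps:
        # (Nearly) parallel rays: vergence carries no depth information.
        return np.inf, None

    t = (b * e - c * d) / denom       # parameter along the left ray
    s = (a * e - b * d) / denom       # parameter along the right ray
    p_l = origin_l + t * d_l          # closest point on the left ray
    p_r = origin_r + s * d_r          # closest point on the right ray
    fixation = 0.5 * (p_l + p_r)      # estimated 3D point of fixation

    cyclopean = 0.5 * (origin_l + origin_r)
    return float(np.linalg.norm(fixation - cyclopean)), fixation


# Two eyes 63 mm apart, both fixating a point 1 m straight ahead:
target = np.array([0.0, 0.0, 1.0])
o_l, o_r = np.array([-0.0315, 0.0, 0.0]), np.array([0.0315, 0.0, 0.0])
depth, point = vergence_gaze_depth(o_l, target - o_l, o_r, target - o_r)
print(round(depth, 3))  # -> 1.0 (metres)
```

Because noisy gaze rays rarely intersect exactly, the midpoint of their closest segment is a standard robust choice. Note that small angular noise translates into large depth error at far distances, which is one reason comparing measured gaze depth across real and augmented conditions is delicate.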