ABSTRACT
Eye tracking can be used to infer what is relevant to a user, and adapt the content and appearance of an application to support the user in their current task. A prerequisite for integrating such adaptive user interfaces into public terminals is robust gaze estimation. Commercial eye trackers are highly accurate, but require prior person-specific calibration and a relatively stable head position. In this paper, we collect data from 26 authentic customers of a fast food restaurant while interacting with a total of 120 products on a self-order terminal. From our observations during the experiment and a qualitative analysis of the collected gaze data, we derive best practice approaches regarding the integration of eye tracking software into self-service systems. We evaluate several implicit calibration strategies that derive the user’s true focus of attention either from the context of the user interface, or from their interaction with the system. Our results show that the original gaze estimates can be visibly improved by taking into account both contextual and interaction-based information.
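The abstract does not spell out the implicit calibration strategies, but the interaction-based idea can be illustrated with a minimal sketch: on a touch terminal, each tap is assumed to coincide with the user's true point of regard, so (raw gaze, touch location) pairs can be collected silently and used to fit a corrective affine map over the raw estimates. All function names below are hypothetical, and the affine least-squares model is an illustrative assumption, not the paper's method.

```python
import numpy as np

def fit_affine_correction(raw_gaze, touch_points):
    """Fit an affine map from raw gaze to true targets via least squares.

    raw_gaze, touch_points: (n, 2) arrays of screen coordinates; each row
    pairs a raw gaze estimate with the touch location assumed (illustrative
    assumption) to be the user's true focus of attention at that moment.
    """
    raw = np.asarray(raw_gaze, dtype=float)
    true = np.asarray(touch_points, dtype=float)
    # Augment with a bias column so a translation offset is learned too.
    A = np.hstack([raw, np.ones((len(raw), 1))])
    # Solve A @ M ~= true for the 3x2 affine parameter matrix M.
    M, *_ = np.linalg.lstsq(A, true, rcond=None)
    return M

def apply_correction(M, gaze):
    """Apply the fitted affine correction to new raw gaze samples."""
    gaze = np.asarray(gaze, dtype=float)
    return np.hstack([gaze, np.ones((len(gaze), 1))]) @ M

# Toy example: raw gaze drifts 40 px right and 25 px below the true target.
true = np.array([[100, 200], [400, 200], [250, 600], [700, 450]])
raw = true + np.array([40, 25])
M = fit_affine_correction(raw, true)
corrected = apply_correction(M, raw)
```

In practice such a scheme would accumulate pairs during normal use and refit periodically, so accuracy improves the longer the customer interacts with the terminal, without any explicit calibration step.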
Conditioning Gaze-Contingent Systems for the Real World: Insights from a Field Study in the Fast Food Industry