DOI: 10.1145/3334480.3381062

EMICS'20: Eye Movements as an Interface to Cognitive State

Published: 25 April 2020

ABSTRACT

Eye movement recording has been used extensively in HCI and offers a window into how users perceive and process information. Advances in hardware are making eye tracking ubiquitously accessible, allowing eye movements to enter common use as a control modality. Recent developments in AI provide powerful computational means for making predictions about the user. However, the connection between eye movements and cognitive state remains largely under-exploited in HCI: despite the rich literature in psychology, a deeper understanding of how this connection can be used in practice is still required. This EMICS SIG will provide an opportunity to discuss possible application scenarios and HCI interfaces that infer users' mental state from eye movements. It will bring together researchers across disciplines with the goals of expanding shared knowledge, discussing innovative research directions and methods, fostering future collaborations around the use of eye movements as an interface to cognitive state, and laying a solid foundation for an EMICS workshop at CHI 2021.
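To make the idea concrete, below is a minimal sketch of the kind of inference pipeline the SIG targets: simple per-trial gaze features (mean fixation duration, saccade amplitude, and pupil diameter) are used to train a classifier that predicts a binary workload label. The feature set, the synthetic data, and the choice of classifier are illustrative assumptions, not methods prescribed by the SIG.

```python
# Hypothetical sketch: predicting a coarse cognitive state (low vs. high
# workload) from eye-movement features. All data here are synthetic and
# the feature choices are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # number of synthetic trials

# Per-trial features commonly reported in the eye-tracking literature:
# mean fixation duration (ms), mean saccade amplitude (deg), mean pupil diameter (mm).
X = np.column_stack([
    rng.normal(250, 40, n),   # fixation duration
    rng.normal(4.0, 1.0, n),  # saccade amplitude
    rng.normal(3.5, 0.4, n),  # pupil diameter
])

# Synthetic labels: "high workload" loosely tied to larger pupils and
# longer fixations, plus noise, so the task is learnable but not trivial.
y = ((X[:, 2] - 3.5) + 0.005 * (X[:, 0] - 250)
     + rng.normal(0, 0.3, n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

On real data, these features would be computed from an eye tracker's fixation and pupil streams, and validation across subjects is essential, since pupil-based measures in particular vary strongly between individuals.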


Published in

CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
April 2020, 4474 pages
ISBN: 9781450368193
DOI: 10.1145/3334480

Copyright © 2020 Owner/Author. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States
