ABSTRACT
Eye movement recording has been used extensively in HCI and offers a way to understand how users perceive and process information. Advances in hardware have made eye recording ubiquitously accessible, allowing eye movements to enter common use as a control modality, and recent advances in AI provide powerful computational means for making predictions about the user. However, the connection between eye movements and cognitive state remains largely under-exploited in HCI: despite the rich literature in psychology, a deeper understanding of its usability in practice is still required. This EMICS SIG will provide an opportunity to discuss possible application scenarios and HCI interfaces for inferring users' mental state from eye movements. It will bring together researchers across disciplines with the goals of expanding shared knowledge, discussing innovative research directions and methods, fostering future collaborations around the use of eye movements as an interface to cognitive state, and laying a solid foundation for an EMICS workshop at CHI 2021.
Index Terms
- EMICS'20: Eye Movements as an Interface to Cognitive State