Research article · Open access · DOI: 10.1145/3603555.3603560

Behind the Screens: Exploring Eye Movement Visualization to Optimize Online Teaching and Learning

Published: 03 September 2023

ABSTRACT

The effective delivery of e-learning depends on the continuous monitoring and management of student attention. While instructors in traditional classroom settings can easily assess crowd attention through gaze cues, these cues are largely unavailable in online learning environments. To address this challenge, we collected eye movement data from twenty students and developed four visualization methods: (a) a heat map, (b) an ellipse map, (c) two moving bars, and (d) a vertical bar, each overlaid on 13 instructional videos. Our results revealed unexpected preferences among the instructors: contrary to expectations, they did not favor the established heat map or the vertical bar for live online instruction, choosing instead the less intrusive ellipse visualization. The heat map nevertheless remained the preferred choice for retrospective analysis because of its more detailed information. Importantly, all visualizations were found to be useful and to help restore emotional connections in online learning. In conclusion, our visualizations of crowd attention show considerable potential for a wide range of applications, extending beyond e-learning to online presentations and retrospective analyses in general. These findings underscore the role such visualizations can play in enhancing both the effectiveness and the emotional connectedness of future e-learning experiences.
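To illustrate the kind of aggregation that crowd-attention visualizations of this sort rely on, the sketch below pools gaze samples from several viewers into a Gaussian-smoothed density map that could be alpha-blended over a video frame. This is a minimal Python sketch under assumed conditions (per-student gaze points given as pixel coordinates, a fixed frame size, and simulated fixations); it is not the implementation used in the paper.

```python
import numpy as np
import matplotlib.pyplot as plt

def gaze_heatmap(gaze_points, frame_size=(1280, 720), sigma=40):
    """Aggregate gaze points from many viewers into a smoothed density map.

    gaze_points: iterable of (x, y) screen coordinates pooled across students
                 (assumed format, one sample per student per frame).
    frame_size:  (width, height) of the video frame in pixels.
    sigma:       Gaussian spread in pixels, roughly the size of the foveal region.
    """
    width, height = frame_size
    density = np.zeros((height, width), dtype=float)
    ys, xs = np.mgrid[0:height, 0:width]
    for gx, gy in gaze_points:
        # Add one Gaussian "bump" per gaze sample.
        density += np.exp(-((xs - gx) ** 2 + (ys - gy) ** 2) / (2 * sigma ** 2))
    return density / density.max()

# Example: simulated fixations from 20 students on a single video frame.
rng = np.random.default_rng(0)
points = rng.normal(loc=(640, 360), scale=120, size=(20, 2))
heat = gaze_heatmap(points)

# In practice the map would be alpha-blended over the current video frame;
# here it is simply displayed on its own.
plt.imshow(heat, cmap="hot", alpha=0.6)
plt.axis("off")
plt.show()
```

The same per-frame density map could in principle feed the other, less intrusive views described above (for example, an ellipse fitted to the gaze distribution, or a bar summarizing its horizontal or vertical spread); those variants are not shown here.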

