Investigating Student Sustained Attention in a Guided Inquiry Lecture Course Using an Eye Tracker


Abstract

This study investigated the belief that student attention declines after the first 10 to 15 min of class by analyzing vigilance decrement in a guided inquiry physical science course. We used Tobii Glasses, a portable eye tracker, to record student gaze during class sessions. Undergraduate students (n = 17) representative of course demographics (14 female, 3 male) wore the eye tracker during 70-min classes (n = 84) or 50-min classes (n = 26). From the gaze point and fixation data, we coded participant attention as either on-task or off-task for every second of data. This analysis yielded a percentage of on-task vigilance for each minute, as well as the amount of time participants spent looking at various locations during the class sessions. Participants' on-task vigilance averaged 67% at the start of class and rose to above 90% by the 7 to 9-min mark, with only minor fluctuation thereafter. Contrary to the belief that attention declines rapidly during a class, participants' on-task spans were both longer and more numerous than their off-task spans. These results suggest that well-structured classes punctuated by student-student and instructor-student interactions can be an effective method of maintaining student attention vigilance for entire class sessions, not just the first 10 min.


References

  • Bradbury, N. (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Advances in Physiology Education, 40(4), 509–513. https://doi.org/10.1152/advan.00109.2016.

  • Bunce, D. M., Flens, E. A., & Neiles, K. Y. (2010). How long can students pay attention in a class? A study of student attention decline using clickers. Journal of Chemical Education, 87(12), 1438–1443. https://doi.org/10.1021/ed100409p.

  • Craig, A. (1984). Human engineering: the control of vigilance. In J. S. Warm (Ed.), Sustained attention in human performance (pp. 247–291). Chichester: Wiley. ISBN 978-0471103226.

  • Davis, B. (1993). Tools for teaching. Jossey-Bass. ISBN 9780787965679.

  • Etkina, E., & Van Heuvelen, A. (2007). Investigative science learning environment—a science process approach to learning physics. Research-based reform of university physics, 1(1), 1–48.

  • Hartley, J., & Davis, J. (1978). Note taking: a critical review. Programmed Learning and Educational Technology, 15(3), 207–224. https://doi.org/10.1080/0033039780150305.

  • Johnstone, A., & Percival, F. (1976). Attention breaks in lectures. Education in Chemistry, 13(2), 49–50.

  • Kohl, P., Rosengrant, D., & Finkelstein, N. (2007). Comparing explicit and implicit teaching of multiple representation use in physics problem solving. In AIP Conference Proceedings (Vol. 883, No. 1, pp. 145–148). College Park: American Institute of Physics.

  • Lloyd, D. (1968). A concept of improvement of learning responses in the taught lesson. Visual Education, 23–25.

  • Lucas, S. G., Bernstein, D. A., & Goss-Lucas, S. (2004). Teaching psychology: a step by step guide. Psychology Press. ISBN 978-1138790346.

  • Macmanaway, L. (1970). Teaching methods in higher education, innovation and research. Universities Quarterly, 24(3), 321–329. https://doi.org/10.1111/j.1468-2273.1970.tb00346.x.

  • Maddox, H., & Hoole, E. (1975). Performance decrement in the lecture. Educational Review, 28(1), 17–30. https://doi.org/10.1080/0013191750280102.

  • McCambridge, J., Witton, J., & Elbourne, D. (2014). Systematic review of the Hawthorne effect: new concepts are needed to study research participation effects. Journal of Clinical Epidemiology, 67(3), 267–277. https://doi.org/10.1016/j.jclinepi.2013.08.015.

  • McKeachie, W. (1986). Teaching tips: a guidebook for beginning college teachers (8th ed.). Heath. https://doi.org/10.1016/0307-4412(88)90090-8.

  • McKeachie, W. (1999). Teaching tips: strategies, research, and theory for college and university teachers (10th ed.). Heath. https://doi.org/10.2307/328598.

  • McLeish, J. (1968). The lecture method. Cambridge: Cambridge Institute of Education.

  • Naveh-Benjamin, M. (2002). The effects of divided attention on encoding processes: underlying mechanisms. In Perspectives on human memory and cognitive aging: Essays in honor of Fergus Craik (pp. 193–207). ISBN 9781134949694.

  • Parasuraman, R. (1986). Vigilance, monitoring, and search. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of perception and human performance, Vol. 2. Cognitive processes and performance (pp. 1–39). John Wiley & Sons. ISBN 0471829560.

  • Parasuraman, R., Warm, J. S., & Dember, W. N. (1987). Vigilance: taxonomy and utility. In L. S. Mark, J. S. Warm, & R. L. Huston (Eds.), Ergonomics and human factors (pp. 11–32). New York: Springer-Verlag. https://doi.org/10.1007/978-1-4612-4756-2_2.

  • Rosengrant, D. (2007). Multiple representations and free-body diagrams: do students benefit from using them? Rutgers University: ProQuest Dissertations Publishing.

  • Rosengrant, D. (2011). Impulse-momentum diagrams. The Physics Teacher, 49(1), 36–39.

  • Rosengrant, D., Hearrington, D., Alvarado, K., & Keeble, D. (2011). Following student gaze patterns in physical science lectures. 2011 Physics Education Research Conference Proceedings, 1413, 323–326.

  • Sarter, M., Givens, B., & Bruno, J. P. (2001). The cognitive neuroscience of sustained attention: where top-down meets bottom-up. Brain Research Reviews, 35(2), 146–160. https://doi.org/10.1016/s0165-0173(01)00044-3.

  • Stuart, J., & Rutherford, R. (1978). Medical student concentration during lectures. The Lancet, 312(8088), 514–516. https://doi.org/10.1016/s0140-6736(78)92233-x.

  • Wankat, P. (2002). The effective, efficient professor: teaching, scholarship and service. Allyn & Bacon. https://doi.org/10.1080/1937156x.2003.11949512.

  • Warm, J. S., Parasuraman, R., & Matthews, G. (2008). Vigilance requires hard mental work and is stressful. Human Factors, 50(3), 433–441. https://doi.org/10.1518/001872008X312152.

  • Wilson, K., & Korn, J. (2007). Attention during lectures: beyond ten minutes. Teaching of Psychology, 34(2), 85–89. https://doi.org/10.1080/00986280701291291.

  • Young, M. S., Robinson, S., & Alberts, P. (2009). Students pay attention! Combating the vigilance decrement to improve learning during lectures. Active Learning in Higher Education, 10(1), 41–55. https://doi.org/10.1177/1469787408100194.

Author information

Correspondence to David Rosengrant.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix. Data coding methodology

On-task gaze focus included the following subcategories: professor, board (whiteboard with notes and PowerPoint projections), notes by the students, classmate (discussion pertaining to class), assessment (quizzes), and other (an activity relevant to class but not one of the previously mentioned categories, such as participating in a demo). Off-task gaze focus included the following subcategories: classmate (activity not pertaining to class), computer/electronic device, room, doodling, and other. We developed the subcategory codes during a pilot study (Rosengrant et al. 2011) that contained eight eye-tracking videos (those results are not part of this study). Each researcher developed a list of objects of attention, including whether that attention was on- or off-task. The researchers then compared codes and discussed discrepancies until they reached 100% agreement. As the 2011 pilot study progressed, we re-examined the initial coding scheme against each new participant video. We found no need to add to or modify our coding scheme as this full study progressed.

To ensure that the videos were coded properly, each coder underwent training. The coder first watched a video that had already been coded and compared their work with the accepted codes. Once any disagreements were resolved (usually minor ones), they coded new videos on their own. Author 1 then coded random minutes of the new videos and compared that work with the coder's to ensure agreement with the already accepted codes.

At extreme angles, the eye tracker tended to lose track of where the eye was looking. For example, a student taking notes would typically look downward with their eyes rather than moving their entire head downward. Even though there was no eye-tracking data in these moments, we still coded the gaze focus as on-task because the video showed participants actively taking notes. Similarly, if participants were looking downward to send or receive text messages on their cell phones, we labeled that as off-task. When a participant had their gaze focused for any length of time and eye-tracking data were recorded, there were noticeable changes in the eye movements. For instance, a participant may have been watching the instructor, but the red dot showing their gaze would be in constant motion. There were very few times when the eye-tracking data showed little to no movement. These rare cases with no eye motion (exceeding 2 s) were labeled as off-task, since we considered them zoning-out instances regardless of where the dot was located.
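
For concreteness, the zoning-out rule lends itself to a simple check over the recorded gaze coordinates. The sketch below is our illustration, not the authors' software: the sampling rate, pixel threshold, and function names are all assumptions.

```python
import math

def zoned_out_spans(gaze_points, fps=30, motion_threshold=5.0, min_still_s=2.0):
    """Flag spans where the gaze dot is essentially motionless.

    gaze_points: (x, y) pixel coordinates, one per video frame (assumed format).
    fps: sampling rate of the eye-tracking video (assumed value).
    motion_threshold: max per-frame displacement, in pixels, still counted
        as "no movement" (illustrative value).
    min_still_s: stillness longer than this is labeled off-task,
        per the 2-s zoning-out rule described above.
    Returns a list of (start_frame, end_frame) spans.
    """
    min_frames = int(min_still_s * fps)
    spans, start = [], None
    for i in range(1, len(gaze_points)):
        (x0, y0), (x1, y1) = gaze_points[i - 1], gaze_points[i]
        if math.hypot(x1 - x0, y1 - y0) <= motion_threshold:
            if start is None:
                start = i - 1          # a still stretch begins
        else:
            if start is not None and i - start >= min_frames:
                spans.append((start, i))   # long enough to count as zoning out
            start = None
    if start is not None and len(gaze_points) - start >= min_frames:
        spans.append((start, len(gaze_points)))
    return spans
```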

We labeled situations in which we had no eye-tracking data and could not tell what the participant was doing as no data. If there was video data but no audio data (this happened rarely), we also coded this as no data, because the verbal discussions were needed to help determine on- and off-task codes. We also created a code called "down time" for any situation where students were in class but no instruction was happening, or when the class took a short break. This was typically after a quiz, when the instructor was collecting papers, or when the instructor was distributing papers such as exams.
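
Taken together, the paragraphs above define a small, fixed vocabulary of codes. A minimal sketch of how that scheme could be represented in software follows; the identifier names are our own illustrative choices, not the authors' instrument.

```python
# Illustrative encoding of the coding scheme described above.
ON_TASK = {"professor", "board", "notes", "classmate_class",
           "assessment", "other_on"}
OFF_TASK = {"classmate_off", "device", "room", "doodling", "other_off"}
EXCLUDED = {"no_data", "down_time"}  # neither on- nor off-task

def task_status(code):
    """Map a per-second subcategory code to 'on', 'off', or 'excluded'."""
    if code in ON_TASK:
        return "on"
    if code in OFF_TASK:
        return "off"
    if code in EXCLUDED:
        return "excluded"
    raise ValueError(f"unknown code: {code!r}")
```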

As we stated earlier in the body of the paper, our method allowed for a minute-by-minute determination of on- or off-task gaze focus for each participant within the specific subcategories. Within a semester, the percentage of gaze duration for a subcategory is the total number of seconds a participant looked in that focus area divided by the total time of data collected for the semester. For each 70-min class, there were 60 data points per minute, multiplied by 70 min, for a maximum of 4200 data points. For each minute of video, on- and off-task codes were averaged to determine the overall task status for that minute. For example, if a participant was on task for 60 s out of 60 s, then they were 100% on-task for that minute. If they were on task for 53 s out of 60, then they had an on-task percentage of 88.3% (53/60 = .883). If part of a minute included down time or no data, then that time was removed from the 60 s and we averaged the remainder. Thus, if a participant was on-task for 42 s out of 60, off-task for 13 s, and had 5 s of no data, we coded that minute as 76.4% on-task (42/55 = .764). In other words, our calculation for each minute divided the number of seconds a participant spent on task by the number of seconds spent either on or off task. If the bulk (more than 75%) of a minute was categorized as no data, then this minute was not included in the on-task percentage. For example, if 55 s were down time and the remaining 5 s were on-task, we excluded the minute rather than coding it as 100% on-task. When we reported averages, they were based on the sum of the seconds per code each semester, not on an average of daily values.
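
As a check on the arithmetic above, the per-minute calculation can be sketched as follows, reusing the hypothetical task_status helper from the previous sketch; this is our reading of the stated rule, not the authors' analysis code.

```python
def minute_on_task_pct(codes, max_excluded_frac=0.75):
    """Per-minute on-task percentage as described above.

    codes: the 60 per-second subcategory codes for one minute.
    Returns a percentage, or None when more than 75% of the minute
    is no data/down time and the minute is dropped entirely.
    """
    statuses = [task_status(c) for c in codes]
    excluded = statuses.count("excluded")
    if excluded > max_excluded_frac * len(codes):
        return None  # e.g., 55 s of down time: not coded as 100% on-task
    on = statuses.count("on")
    off = statuses.count("off")
    return 100.0 * on / (on + off)

# Worked example from the text: 42 s on-task, 13 s off-task, 5 s no data.
codes = ["board"] * 42 + ["device"] * 13 + ["no_data"] * 5
assert round(minute_on_task_pct(codes), 1) == 76.4  # 42/55 = .764
```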

Once we had our percentages for each minute, we extended this analysis to the semester as a whole in two ways. Both began by displaying the mean percentage of time on task, minute by minute, for all participants during each semester of the study. We averaged each minute of class meeting time to give a percentage of time that the student was on-task, as described previously. For our first semester-level analysis, we averaged each minute of a class session with the corresponding minute of the other class sessions in that same semester, unless there was no data for that particular minute. For example, if minute 23 of class 4 was no data (which could include down time), then minute 23 of class 4 was not averaged in with minute 23 of the other classes; the minute 23 data from every other class session were averaged together to form the timelines presented in the results section.
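
A minimal sketch of this cross-session averaging, assuming each session is stored as a list of per-minute on-task percentages with None marking no-data minutes (as in the previous sketch):

```python
def average_minute_timeline(sessions):
    """Average corresponding minutes across class sessions, skipping
    any session whose value for that minute is None (no data).

    sessions: list of per-session lists of per-minute percentages.
    Returns one averaged timeline for the semester.
    """
    n_minutes = max(len(s) for s in sessions)
    timeline = []
    for m in range(n_minutes):
        vals = [s[m] for s in sessions if m < len(s) and s[m] is not None]
        timeline.append(sum(vals) / len(vals) if vals else None)
    return timeline

# If minute 23 of class 4 is None, only the other sessions contribute
# to the semester average for minute 23.
```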

For our second semester-level analysis, we operationalized vigilance at three levels: high (on-task focus 90 to 100% of the coded minute), medium (on-task focus 70 to 89.99% of the coded minute), and low (on-task focus less than 70% of the coded minute). We chose 90% as the high cutoff because it allows for small variations in a minute where a person may have quickly glanced at something coded as off-task but still maintained on-task activity for nearly all of the minute. We chose 70% as the cutoff for low because we would rather be overly critical in this area: even though a student may have been paying attention for a majority of such a minute, they typically had larger chunks of off-task behavior. Next, we calculated durations of vigilance for each level. If a student was in the high category for seven consecutive minutes and then exhibited another level of vigilance or down time in the next minute, we counted that as one high-level vigilance span with a 7-min duration. We then counted the number and length of all vigilance durations at all three levels in every class session of every semester. Next, we calculated, per level per semester, the average duration of vigilance (along with SDs), the total number of minutes, the longest single duration, and the number of durations longer than 5 min. We chose 5 min because we felt this was the minimum duration demonstrating sustained attention, and it was relatively close to the averages we found in most of the semesters.
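
The cutoffs and the run-length counting described above could be implemented along these lines; the function names and the returned statistics dictionary are our illustrative assumptions, not the authors' code.

```python
from statistics import mean, stdev

def vigilance_level(pct):
    """Classify a coded minute: high >= 90%, medium 70-89.99%, low < 70%."""
    if pct >= 90:
        return "high"
    return "medium" if pct >= 70 else "low"

def vigilance_durations(timeline):
    """Run lengths, in minutes, of consecutive same-level vigilance.

    timeline: per-minute on-task percentages; None (down time/no data)
        ends a run, matching the counting rule described above.
    """
    runs = {"high": [], "medium": [], "low": []}
    level, length = None, 0
    for pct in timeline + [None]:  # trailing sentinel flushes the last run
        cur = vigilance_level(pct) if pct is not None else None
        if cur == level:
            length += 1
        else:
            if level is not None:
                runs[level].append(length)
            level, length = cur, 1
    return runs

def summarize(durations):
    """Per-level statistics reported in the appendix: mean duration
    (with SD), total minutes, longest run, and runs longer than 5 min."""
    return {
        "mean": mean(durations) if durations else 0,
        "sd": stdev(durations) if len(durations) > 1 else 0,
        "total_min": sum(durations),
        "longest": max(durations, default=0),
        "over_5_min": sum(1 for d in durations if d > 5),
    }
```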

About this article

Cite this article

Rosengrant, D., Hearrington, D. & O’Brien, J. Investigating Student Sustained Attention in a Guided Inquiry Lecture Course Using an Eye Tracker. Educ Psychol Rev 33, 11–26 (2021). https://doi.org/10.1007/s10648-020-09540-2
