
Volume 25, Number 1
Fall 2013


https://doi.org/10.21061/jte.v25i1.a.3

The Evolving Classroom: A Study of Traditional and Technology-Based Instruction in a STEM Classroom

Timothy J. Devlin, Charles R. Feldhaus and Kristin M. Bentrem

One need only read the most recent newspaper, periodical, or research journal to realize that unprecedented change is occurring in education. According to Kimmelman (2006), since the seminal report A Nation at Risk was published in 1983, the call for education reform has increased dramatically over the last 25 years. During the last ten years, the No Child Left Behind Act of 2001 (NCLB) has ensured that educators at every level focus on accountability, use scientifically based research, make data-driven decisions, and administer standardized tests in an effort to improve student learning. Love it or hate it, NCLB has been the catalyst for huge changes in the world of education from kindergarten through 12th grade (K–12). Based on new reform models, some researchers have found that too many students enrolled in K–12 classrooms do not achieve at levels necessary to be globally competitive (National Center for Education Statistics, 2009). Clearly, new policies, expectations, and accountability measures have changed the way teachers teach and students learn.

Any discussion of technology and engineering literacy must start with a clear idea of exactly what technology and engineering literacy means. That, in turn, requires a clear definition of technology. The International Technology Education Association (now the International Technology and Engineering Educators Association) developed the Standards for Technological Literacy: Content for the Study of Technology (2000, 2002, & 2007), and the definition of technology included in that document was used for the purposes of this study:

Broadly speaking, technology is how people modify the natural world to suit their own purposes. From the Greek word techne, meaning art or artifice or craft, technology literally means the act of making or crafting, but more generally it refers to the diverse collection of processes and knowledge that people use to extend human abilities to satisfy human needs and wants. (2007, p. 2)

Some believe that technology is a very effective way of engaging young minds and improving student learning (Carlson, 2005). However, considering the explosion of social media, hand-held technology, and the numerous ways for Millennials (the generation born between 1980 and 2000) to get screen time, the fear is that students cannot truly focus or multi-task effectively, especially when asked to follow specific instructions. To think creatively, work in teams, and develop deep understanding through project-based learning, students must understand that technology is a useful tool but not a replacement for human interaction. There is significantly more information available to be consumed today than in past generations, and Millennials have more ways to consume it than ever before. To say that students do not have the ability to learn, engage, and concentrate greatly underestimates their abilities. Learners have simply grown accustomed to acquiring information and communicating by utilizing technology-based methods (Moore, 2007).

The use of technology has become more prevalent in schools and has been shown to facilitate student learning objectives. According to Gulek and Demirtas (2005), there is substantial evidence that incorporating technology of any kind in the classroom as an instructional tool enhances student learning and educational outcomes. Numerous studies (Gulek & Demirtas, 2005; Spires, Lee, Turner, & Johnson, 2008; Edwards, 2007) have found that using any form of technology with students who are considered Millennials boosted both concentration and engagement. Students who used technology were (a) spending more time involved in collaborative work, (b) participating in more project-based instruction, (c) producing writing of higher quality and greater length, (d) gaining increased access to information, (e) improving research analysis skills, and (f) spending more time doing homework digitally. Studies have also determined that using technology at the beginning of class sessions helped students stay on task and concentrate (Spires et al., 2008). Additional research is necessary to confirm the findings of researchers who have studied the incorporation of technology in student learning, to provide new knowledge about how Millennials perceive the world and their learning experiences, and to provide new pathways for classroom teachers who wish to make a difference in the lives of students by using all tools available to them. This study used the action research model developed by Mills (2010) to better understand the effect that technology has on the delivery of instruction for hands-on, project-based Science, Technology, Engineering, and Mathematics (STEM) assignments to middle school students. The research examined the effect on students' ability to follow instructions, think critically, and work collaboratively when instructions are delivered either through in-person communication or through pre-recorded video. Technological advancement in the classroom, engaging the Millennial student population, and current teaching methods are discussed.

Literature Review

In 2000, the International Technology Education Association established a formal definition for technological literacy: “Technological literacy is the ability to use, manage, assess, and understand technology” (International Technology Education Association, 2000, p. 7). Many authors (Mentzer & Becker, 2010; Gamire & Pearson, 2006; Pearson & Young, 2002) declare that a unifying theme relative to technological literacy is that technologically literate people are able to function in our modern technological society. One element that is often contained in the discussion of technological literacy is the concept of technological competence. Autio and Hansen (2002) defined technological competence as an interrelationship between technical abilities in psychomotor, cognitive, and affective areas. Researchers (Layton, 1994; Autio, 2011) have also established three components that are considered dimensions of technological competence: (1) technological knowledge, defined as knowing something about technological concepts, principles, and connections as well as the nature and history of technology; (2) technological skill, defined as tactile and kinesthetic ability as well as practical intelligence (often called psychomotor skills); and (3) technological will, defined as being active and enterprising with regard to technology. It is important for technology education faculty, especially those teaching middle school students, to understand that current students have very different knowledge, skill sets, and understandings of both technological literacy and competence. The concepts of technological knowledge, skill, and will must be considered as STEM educators continue efforts to increase student achievement in a reform-based educational environment.

Although using technology to engage students has only recently become a topic of research, vast resources are available and the literature has grown significantly in the past five years. There is substantial evidence that incorporating technology of any kind in the classroom as an instructional tool enhances student learning and educational outcomes. Gulek and Demirtas (2005) provided students with laptops and observed an increase in collaborative work, better research skills, greater quantity and quality of writing, and more time spent doing homework. Caruso and Kvavik (2005) found that students tend to use technology for convenience in both academic and social activities. Additionally, these researchers found that laptop ownership increased by more than 10% from 2004 to 2005. Students who perceive their instructors to be effective users of technology report greater course engagement, more interest in the subject matter, and better understanding of complex concepts (Caruso & Kvavik, 2005).

Using technology in the classroom has an effect that extends well beyond the student population. Gulek and Demirtas (2005) report that teachers who incorporate technology in their classrooms generally have a constructivist approach to teaching. They also suggest that the use of technology makes teachers feel more empowered in the classroom; consequently, teachers spend less time lecturing because their students are involved in critical-thinking-based problem-solving activities, active learning, and interactions with fellow students.

Currently, K–12 educators are faced with challenges in both technological literacy and competence. One of the greatest tasks facing educators is how to educate and engage students who live in a world of “ubiquitous information and communications-related digital technologies (e.g. web, hand-held devices, cell phones, and gaming consoles)” (Spires et al., 2008, p. 497). McGlynn (2008) believes that student engagement is the key to academic motivation, persistence, and degree completion. Certainly, engaged students are more likely to become technologically literate and competent. Researchers from the discipline of technology education (Koch & Sanders, 2011; Jonassen, 2000; Todd, 1999; Williams, 2000) found that engagement is often maximized when students are exposed to hands-on, project-based curriculum that requires them to solve problems. Students who are provided assignments that give them an opportunity to observe, evaluate, communicate, model, generate ideas, research/investigate, produce, and document success/failure are often self-directed and engaged (Williams, 2000; Koch & Sanders, 2011).

According to Spires et al. (2008), students want their schools to look more like the world around them. They want items in their environment that inspire and motivate them to learn and achieve. In a recent study, when middle school students were asked to describe their ideal educational environments, they described schools that had wireless technology, flexible work environments, and work areas that mimic the workplaces of today (Edwards, 2007). Clearly, some researchers believe that educational institutions at all levels, but particularly middle schools, should focus on creating learning environments that emulate the professional environments in which students may one day work.

Millennials and the Art of Educating the Digital Native

Sometimes referred to in the media as "Generation Y," Millennials are the children of the post-WWII baby boomer generation. The Millennial generation has been immersed in technology from birth and thrives on collaboration. This generation imitates previous generations by displaying the light from their cell phones at concerts where once lighters were held high; they don’t remember Elton John being in the rock and roll genre, and their parents are older than Kermit the Frog (Moore, 2007). Yet many teacher-training programs are still centered on industrial models that existed during the mid-twentieth century. These dated educational methods have created frequent misunderstandings, often set newly trained teachers up to fail, and, perhaps more importantly, impeded educational improvement, advancement, and change (Woempner, 2010).

Millennials have already been pegged and defined by academics, trend spotters, and futurists: They are smart but impatient. They expect results immediately. They carry an arsenal of electronic devices—the more portable the better. Raised amid a barrage of information, they are able to juggle a conversation on Instant Messenger, a Web-surfing session, and an iTunes playlist while reading Twelfth Night for homework. Whether or not they are absorbing the fine points of the play is a matter of debate. (Carlson, 2005, p. A34)

Carlson (2005) concludes that Millennials expect to be able to choose what, where, when, and how they learn. Educators should be prepared to include blogs, videos, video games, and even handheld devices such as iPads and BlackBerrys. Although throwing out textbooks and traditional teaching methods might be met with resistance, teachers should understand that “Millennials consume and learn from a wide variety of media, often simultaneously” (Carlson, 2005). McGlynn (2008) believes that the process of reaching these students in order to engage, motivate, and inspire them cannot be ignored. There must be an intersection between how Millennials learn and how educators teach.

The challenge for the educators and technology developers of the future will be to find a way to ensure that this new learning is highly situated, personal, collaborative and long term; in other words, truly learner-centered learning. (Naismith, Lonsdale, Vavoula, & Sharples, 2004, p. 36)

Wisniewski (2010) observes two major paradigms in today’s schools: behaviorist and constructivist. Advocates of the behaviorist paradigm believe that the purpose of educators is to transfer knowledge to the learner in the form of direct instruction and memorization and then to judge effectiveness with a traditional assessment. Efficiency is key, and the transfer of knowledge is time sensitive and normally done through lectures. In contrast, advocates of the constructivist paradigm believe in a very different approach. Constructivists believe that knowledge is built on top of existing knowledge. They also believe in demonstrating real-world connections to increase engagement and authenticity. The core belief of this form of education is that students play an active role in constructing new knowledge. The learning is student-centered, and the teacher takes on the role of facilitator. Shaw (2009) contrasts these methods in recent research, and the differences between the two can be seen in Table 1.

Table 1
20th Century vs. 21st Century Education (Shaw, 2009)

20th Century Classroom | 21st Century Classroom
Time-based | Outcome-based
Focus on memorization of discrete facts | Focus on what students know and can do
Lessons focus on the lower levels of Bloom’s taxonomy—knowledge, comprehension, and application | Lessons emphasize upper levels of Bloom’s taxonomy—synthesis, analysis, and evaluation
Textbook-driven | Research-driven
Passive learning | Active learning
Learners work in isolation | Learners work collaboratively with classmates and others around the world
Teacher-centered: teacher is center of attention and provider of information | Student-centered: teacher is facilitator/coach
Fragmented curriculum | Integrated and interdisciplinary curriculum
Teacher is judge; no one else sees student work | Self, peer, and authentic assessments
Curriculum/school is irrelevant and meaningless to the students | Curriculum is connected to students' interests, experiences, talents, and the real world
Print is the primary vehicle of learning and assessment | Performance, projects, and multiple forms of media are used for learning and assessment
Literacy is the 3 R’s—reading, writing, and math | Multiple literacies of the 21st century—aligned to living and working in a globalized new millennium

Most schools are involved in a paradigm shift as they move away from traditional methods and toward a constructivist approach (Wisniewski, 2010). Studies show that today’s college graduates have spent less than 5,000 hours of their lives reading text, while they have spent over 10,000 hours playing video games and 20,000 hours watching television (Prensky, 2001). This generation has been characterized as digital natives (Prensky, 2001). A digital native is defined as a person who has grown up immersed in technology and often has the characteristics seen in Table 2. The educators who are trying to engage these students are considered digital immigrants, and often they are learning digital technology as if it were a second language (Prensky, 2001).

Table 2
Digital Native Characteristics (Prensky, 2001)

  • Grew up with technology
  • Function best when networked
  • Parallel process and multi-task
  • Thrive on instant gratification and frequent rewards
  • Prefer graphics before text
  • Prefer random access
  • Expect adults to consult and include them

Generational misunderstandings regarding technology use can hamper communication. However, if the digital immigrant generation utilized technology in the same way Millennials do, communication barriers would break down, ultimately benefiting the educational process. Educators need to modernize their methods and set aside their generational preferences if they truly want to engage every student (Woempner, 2007).

Methodology

According to Mills (2010), action research is any systematic inquiry conducted by teacher researchers, principals, school counselors, or other stakeholders in the teaching/learning environment to gather information about how their schools operate, how they teach, and how well their students learn. In short, action research is done by teachers for themselves. Mills (2010) recommends appropriate methods for collecting data in action research, and his five steps of inquiry were used to conduct this investigation: (1) identification of the problem, (2) collection and organization of data, (3) interpretation of data, (4) action based on data, and (5) reflection.

The problem was identified as a lack of focus and an inability of middle school students to follow instructions at the beginning of class during courses taught at a high-needs urban middle school of over 1,100 students. A concurrent triangulation mixed methods action research design was then developed based on the questions for mixed methods studies created by Creswell (2009).

In a concurrent triangulation approach, the researcher collects both qualitative and quantitative data concurrently and then compares databases to determine if there is convergence, differences, or some combination [of the two]. Some authors refer to this as confirmation, disconfirmation, cross-validation, or corroboration (Greene, Caracelli, & Graham, 1989; Morgan, 1998). (Creswell, 2009, p. 213)

This traditional mixed methods model is advantageous to action researchers because it “can result in well-validated and substantiated findings” (Creswell, 2009, pp. 213–214).

Materials and Procedure

The key elements of this study included a hands-on, problem-solving STEM activity; written instructions either read by a classroom teacher or delivered through video; an observation checklist; a two-question survey; and seven interview questions asked of a focus group. Table 3 provides a synopsis of the STEM activity and the directions provided to students. The survey questions (Appendix A) asked students to specify how instructions were given to them and then to rate their ability to understand the instructions on a scale from 1–10. The interview questions (Appendix B) asked the students to describe the instructions they were given and to describe their perceptions of the instructional delivery method.

Table 3
STEM Activity: Instructions Delivered to Students

  1. Work in groups of three.
    • If the number of students is uneven, then form two groups of two
  2. Work together to use the paper and the tape placed on the desks to design a structure that could hold a regular textbook 10” above the table.
    • The distance between the lowest part of the textbook and the table had to be at least 10”
    • Hold as many books as possible
    • Do not ask any questions about the assignment
  3. All groups of 2 or 3 received the following materials:
    • Four sheets of 8.5” by 11” regular computer paper
    • Six inches of masking tape
    • A pair of scissors
    • A ruler

To ensure that the two sets of instructions were as similar as possible, a bulleted list of criteria that needed to be covered was created (Appendix C). The video instructions were filmed with a Canon Rebel T2i camera. The video was edited in Adobe Premiere and included additional elements, such as music and on-screen text, that emphasized the instructions. The video was uploaded to YouTube and can be found at http://www.youtube.com/watch?v=nk4v6xEYN0s.

On the first day of class, students who were new to both the course and the teacher came into the classroom, where a PowerPoint slide directed them to prepare a nametag and await further instructions (Appendix D). Once the nametags were prepared, the class was greeted and attendance was taken. Each class was told that in a few moments instructions would be given to the entire class and that follow-up questions would not be permitted. Three of the six classes were shown the video, and the other three were given verbal instructions accompanied by a PowerPoint presentation.

After the instructions were given, the instructor stayed in front of the class until groups were formed. Each group was then given the materials outlined in Table 3. Researchers utilized an observation sheet to chart student behaviors (Appendix E). Behaviors observed included the number of questions asked, the number of groups formed that did not contain three members, and whether instructions were followed. Each group was given 30 minutes to complete the challenge.

At the conclusion of the challenge, each participant was given a brief survey (Appendix A). One class from each instructional delivery method was invited to remain for a focus-group interview where pizza was served. Twenty-two students were interviewed and served as a focus group for the various delivery methods. The focus group interactions and responses were recorded with an iPhone and were later transcribed.

Participants

Participants were self-selected by enrolling in the classes in which this research took place. No students who elected to take these classes were excluded. Class sizes were predetermined and unaltered for this study, and participants were not excluded on the basis of ethnicity, gender, or learning ability. It was assumed that participants fairly represented the entire student body because they were obtained from preexisting class rosters and the classes were open to the entire middle school population. Internal procedures for classroom action research as outlined by the policy manual of the school corporation were followed, and the entire experiment was reviewed and approved by the university institutional review board.

Six classes were utilized for this study. Three classes, consisting of fifty students, received video instruction and three classes, consisting of thirty-seven students, received in-person instruction (Table 4).

Table 4
Participant Numbers by Class and Instruction Method Utilized

Class | Students | Instruction Medium
1 | 18 | Video
2 | 14 | Verbal
3 | 24 | Video
4 | 8 | Verbal
5 | 8 | Video
6 | 15 | Verbal

Results

Three data collection methods were used to gather both qualitative and quantitative data for this study. A survey was provided to all students who participated in the exercise, a selected focus group of participants was interviewed, and an observation checklist was used in an effort to triangulate the data. The survey and the observation checklist were designed to provide quantitative data, and the focus group interviews were designed to provide qualitative data on participant perceptions of the experience. The results of each method of data collection are included, and the appendices provide readers access to the instrumentation.

Survey and Focus Group Interview Results and Comparisons

Following the activity, all participants were given a survey (Appendix A). They were asked to rate, on a scale from 1–10, their ability to follow the instructions that were given. Of the students who received video instruction, 86.58% stated that they could follow the instructions provided. Of the students who received in-person instruction, 85.07% stated that they could follow the instructions.

These findings reveal little difference in participants' reported ability to follow instructions based on the delivery method; the two percentages differed by only 1.51 points (86.58% vs. 85.07%). However, during the activity, it was observed that many students in both conditions were watching other groups and then adjusting their own designs to resemble those of the successful groups.

Figure 1. Average number of questions asked per class (verbal instruction: 16; video instruction: 10).

When observing the number of questions each group tried to ask the teacher, a notable difference between groups emerged, as seen in Figure 1. The participants who were given the video instructions asked an average of ten questions per class, while the classes that were given in-person instructions asked an average of sixteen. Additionally, students in the video-instruction group asked whether they were allowed to make statements as opposed to asking questions. It was observed that students who received video-based instruction were more likely than students who received in-person instruction to correct each other when someone started to ask a question.

There was also a difference in the number of participants who asked their own groups if they could get more materials, tried to take more materials, or asked the instructor for more materials, as seen in Figure 2. An average of four groups among the classes given in-person instructions tried to acquire more materials, while an average of only two groups among the classes that received video-based instruction asked for more materials. One group that was given verbal instructions tried to organize the class into one large group so they could share all of their materials. Overall, the participants who received video-based instruction were less likely to request or try to acquire more materials for their project.

Figure 2. Average number of requests for additional materials (verbal instruction: 4; video instruction: 2).

As part of the instructions given, group size was also observed, and the number of participants in each group was noted (Figure 3). The participants shown the video-based instructions were more likely to follow the instructions, while the classes given in-person instructions averaged two student groups that did not follow the grouping instructions. Because it was the first day of classes, some students arrived after the instructions were given.

Figure 3. Average number of groups formed not containing two or three members (verbal instruction: 2; video instruction: 0).

The classes that were shown the video instructions were able to explain to these newcomers that they had to find a group that contained only two students. It was observed that when a group of four participants formed, those students were more likely to disengage earlier than other groups. Observations found that participants given video-based instruction were more likely to remain engaged than participants given in-person instruction. It was also noted that students given video-based instruction were more likely to follow the instructions accurately than groups that were given in-person instruction.
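As an illustration of how the observation-checklist counts behind Figures 1–3 could be tabulated, the short sketch below averages per-class tallies by delivery method. The class-level numbers are hypothetical placeholders chosen only so that they reproduce the per-condition averages reported above (questions: 16 vs. 10; material requests: 4 vs. 2; off-size groups: 2 vs. 0); they are not the study's raw data, and the study itself did not use this code.

```python
# Illustrative sketch: per-class observation-checklist tallies are hypothetical
# placeholders; only the per-condition averages reported in the article
# (questions 16 vs. 10, material requests 4 vs. 2, off-size groups 2 vs. 0)
# come from the study.
from collections import defaultdict
from statistics import mean

# One record per class: (delivery method, questions asked,
# requests for extra materials, groups not of size 2 or 3)
observations = [
    ("verbal", 18, 5, 3),  # hypothetical class-level counts
    ("verbal", 16, 4, 2),
    ("verbal", 14, 3, 1),
    ("video",  11, 2, 0),
    ("video",  10, 2, 0),
    ("video",   9, 2, 0),
]

# Group the class-level records by delivery method.
by_method = defaultdict(list)
for method, questions, requests, off_size_groups in observations:
    by_method[method].append((questions, requests, off_size_groups))

# Average each observed behavior within each condition.
for method, rows in by_method.items():
    questions, requests, off_size = zip(*rows)
    print(f"{method:6s} avg questions={mean(questions):.1f}  "
          f"avg material requests={mean(requests):.1f}  "
          f"avg off-size groups={mean(off_size):.1f}")
```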

Focus Group Interview Results

Question #1. Participants were asked various questions (Appendix B), along with a variety of sub-questions, during the focus-group interview. Student responses revealed a difference in the ability to reflect on the instructions depending on how the instructions were delivered. Participants were asked to describe the instructions they were given. Students who received in-person instructions offered only a brief recollection of the instructions; they were not able to reflect on them or offer any comments beyond describing them. The only response other than reciting the instructions came from one student who said, “The instructions were easy.” Some responses included erroneous information, such as “We were told we couldn’t ask questions” and “We were given four pieces of paper and three pieces of tape and we had to hold a book six inches off the table.”

In contrast, when asked to describe the instructions, students who received video-based instruction offered a greater amount of reflection. Students perceived the video-based instructions to be easier, more likely to be understood, and easier to recall. Participants observed that they were more likely to pay attention because it was a video and not a teacher. One student commented on attention span, stating that participants had to concentrate on the video rather than being able to talk to each other. Most students described a sort of novelty to the video instructions, which resulted in their own observation of higher engagement. One participant said, “Our instructions were given over video, which I thought was pretty cool… you have to push yourself to pay attention, it pushes you to remember.” Additionally, some participants felt that “kids will focus on [video-based instructions] more than when a teacher gives them.” Another student simply stated, “It is harder to concentrate when a teacher is talking.”

Question #2. Next, participants were asked to describe what they liked most and least about the instructions. Participants given the in-person instruction were less responsive to this question. Student comments included having an increased understanding of the instructions “because it was from [the teacher] and not the computers” and noting “that [delivery of the instruction] wasn’t going too fast or too slow.” Other students felt that the instructions were straightforward and that questions did not need to be asked. The less reflective nature of their answers could be a result of these students receiving instructions in the manner they are accustomed to, whereas students shown video-based instructions had something new to compare with the status quo delivery method.

Participant responses reinforced the novelty of using video and, correspondingly, the increased student engagement. One student said that the “first thing I liked was that it was on a video, I had never seen that before,” and another felt that “it was a new way to understand things and it made me understand them a lot more.” Other comments from students included general observations such as “the video was more fun” and “cool.” Students felt that video-based instruction “[got] to what you need to do quicker so you [could] do better than what the teacher said.” Participants also felt that teachers were apt to provide extraneous information, saying, “I liked how the video was to the point… the extra stuff confuses me” and “[the video is] easier because when a teacher talks it takes way too long.” In contrast, another student thought that “Teachers leave out parts of the instructions [and that watching a] video makes it more simple.”

Participant comments reinforced their trust in technology-based communication, which is notable because the instructions given in person were the same as those given through video. Additionally, students perceived the video as more informational than the verbal instructions even though the teacher provided additional examples in the in-person instruction. One student succinctly summed up this general feeling by saying, “In a video you are shown instead of told.”

When asked what they liked least, both groups of students cited frustration with not being able to ask questions. Although this was the extent of the complaints from the video-based instruction group, the in-person instruction group was more prolific and varied in its responses, oftentimes contradicting each other. One student responded, “I would prefer the teacher [in-person] over a YouTube video, unless I was in a big class then I would want videos.” One participant suggested that “it would have been easier watching a video because you could have answered more questions” and that “video can show better examples,” while another suggested the opposite: “it’s easier to understand the person than to watch the movie, even if you play it back over and over.”

Question #3. Some respondents recognized that a teacher could use video-based methods to complement instruction and aid student understanding. Students displayed an appreciation for in-person instruction by explaining that a teacher can modify their instructions, whereas a video can only repeat the pre-recorded instructions. Potential barriers to effective video instruction cited by participants included audio issues, lack of understanding, and not being shown how to complete the project. Students who received video-based instruction communicated their understanding that technology is not always reliable and that a teacher can always supply more instruction.

One student, pushed beyond their comfort zone, stated, “[I] had to ask my group more questions, and I don’t do that much.” Another student spoke to group collaboration: “Afterwards, I couldn’t figure out what to do, I couldn’t ask questions so I had to stick to my group.” Although offered as “negative” evidence about the video-based instruction, these statements point to positive social and cognitive problem-solving development. “The video helped us know who we were working with, [it] helped us know the other people,” said one student. Many listed not being able to ask questions as a barrier. However, students shown the video-based instructions consistently said that the instructions forced them to work more collaboratively and think more critically.

Question #4. Participants were then asked how they overcame the barriers they described. Both groups expressed that they were more likely to look to their groups to have their questions answered. However, their main reason for this was not the delivery method itself but the fact that they were not allowed to ask questions. In support of this finding, one group responded, “If no one on the team knew, we would keep thinking [until we] figure it out.”

Question #5. Finally, both groups were asked to provide any other comments regarding the delivered instruction. Some general negative responses from the in-person instruction group included “[in-person] instructions stay in my head easier, while videos I can just zone out” and “sometimes videos are a little more confusing.” Positive comments from this group included “I liked how you were very straight to the point; you told it how it is” and “It was fun being forced to figure it out.” Another student thought that a “teacher would have more time to help” than just watching a video.

The student responses indicated that in-person instruction could be more confusing if the teacher elaborated. Students also thought that videos could be fun and interesting in some situations. Some participants were in favor of a multifaceted approach that utilized both in-person and video-based instruction stating that “It would be cool to have both” and “I think it would help to have both video and a teacher.”

Students who preferred the video instruction felt that “it was a lot better than sitting down and watching a teacher talk” and that “watching a video made me more interested to get [my work] done.” One participant who felt more comfortable with the video stated, “I would definitely watch a video before asking a teacher [a question].” One student interested in skill development suggested, “It made me work more with a group than I normally do and that is something I need to do more.” The researchers found that some students perceived that the instruction was better simply because it was presented using video: “I think they were better instructions because they were on a video.”

Students all agreed that the class could not take place without the teacher, and they expressed their appreciation for a teacher’s ability to incorporate videos into instruction. Student perceptions seemed to agree with Caruso and Kvavik’s (2005) findings that teachers who effectively incorporate technology foster higher student engagement, more student interest, and greater student understanding.

An analysis of the student responses from transcribed focus group sessions revealed numerous common themes. Student focus group common themes are detailed in Table 5.

Table 5
Findings/Common Themes

Student Focus Group Common Themes | Video-Based | In-Person
Critical thinking demonstrated through responses | X |
More responsive answers | X |
Perceived novelty of technology in the instructions | X |
Perceived instructions were informative | X |
Perceived instruction aided understanding | X |
Perceived instructions were reliable | | X
Perceived instructions helped critical thinking | X |
Perceived instructions helped collaborative work | X |
Perceived instructor could help facilitate understanding | X | X
Perceived extraneous information could be confusing | | X

Conclusions, Recommendations, and Lessons Learned

The students who answered the questionnaire perceived no difference in their understanding of the instructions. However, participant perception and action did not align: when comparing the video groups to the in-person groups, there was a notable difference in the number of correctly sized groups formed and the number of questions asked. Students all perceived that they understood the instructions given, regardless of the method of delivery, but their actions (forming incorrect groups, asking questions, etc.) indicate otherwise. According to the results of the observation checklist, students who received instructions via video seemed more engaged in the activity and were able to stay engaged without asking questions. They also completed the project to specification in the time allotted. Students who received instructions for the exact same assignment from the teacher were less likely to exhibit those behaviors. Additional research should be completed with larger sample sizes to determine whether the method of delivering instructions for complex, hands-on, project-based STEM activities results in an increased ability to follow instructions, think critically, maintain focus, and complete tasks to specification. Additionally, variables such as learning style, gender, race, geographic location, and age should be taken into account in these studies.

The students who saw the video were more reflective about the instructions they were given. Many students who were given verbal instructions were very slow to recall their feelings and perceptions of the event. The students given the video instructions seemed more excited about the project and declared that the video encouraged them to work in their groups more and inspired them to think critically. Based on the findings, it is apparent that adding video-based instruction to in-person instruction can be beneficial to student engagement and learning. However, more observations need to be done on a larger scale to verify these findings. Research using other social media tools such as Facebook, Twitter, and Skype should be developed to determine whether these media sources enhance or detract from the ability of middle school students to follow instructions and perform tasks. Additionally, studies should be performed to determine other factors that might influence middle grade students' perceptions of the most efficient and beneficial ways to receive instructions. Various characteristics attributed to Millennials, such as rapid, free-choice, random access to data, information, and resources, should also be examined by learning science researchers.

To expand upon this study, various types of video instruction should be prepared and delivered so that educators can assess the benefits for student learning. In this study, both qualitative and quantitative data analysis revealed that participants who received initial instructions via video displayed a higher level of engagement, but it remains to be determined whether that can be attributed to the utilization of technology and, if so, to what types of technology. Future work could include investigating the instructor’s willingness to answer questions and the resulting effect on student critical thinking skills. It was found that students were less likely to ask questions when given video instruction instead of in-person instruction. Students who received video-based instruction were also more likely to adhere to the instructions and to utilize critical thinking skills more often. These outcomes support findings that students in classes where teachers utilize more information technology report more engagement, more interest in the subject matter, and a better understanding of complex subjects. But the real question is why?

Although this research indicates that video instruction better engaged students, once such delivery is uniformly implemented, the novelty of the method may diminish. Given that technology and its accessibility continue to advance and change, educators will be challenged to keep pace with the latest technological developments and to use them as learning tools to reach Millennials. This research suggests that educators must keep technological pace with students and diversify their teaching methods in order to keep students engaged. Additional studies could focus on the adaptability of teachers who educate utilizing behaviorist methodology and those who tend to use constructivist methods.

This small study came from a realization by the authors that students were not paying attention to, or engaging with, class assignments. This often caused discipline problems and a sense of chaos in this urban middle school classroom. The author came to realize that the first minutes of class are crucial to student engagement and set the tone for the rest of the class period. As a result, this mixed methods action research study was designed not only to inform practice for other technology and engineering faculty who teach Millennials but also to inform the author. Clearly, the findings of the study are not generalizable; however, in keeping with the traditions of action research, the author did learn lessons that will impact action and reflection. Those lessons include:

  1. The use of technology, in this case YouTube, may indeed play a role in engaging middle school students during the first, crucial minutes of an engineering design assignment.
  2. Students perceived that the use of YouTube technology to deliver assignment instructions was beneficial and helped them better understand the requirements of the assignment and focus on what was required for a successful outcome.
  3. This author will continue to use various social media to engage students in a variety of ways.

Hopefully, more action research will be conducted on the issues revealed in this study and a body of knowledge will be constructed. If high quality teaching in an urban middle school setting is to flourish, teachers must try all methods possible to engage students. This study shows promising results to that end.


Timothy J. Devlin (tim.devlin24@gmail.com) is a former Technology and Engineering Teacher at Lynhurst Middle School in Indianapolis, Indiana; Charles R. Feldhaus (cfeldhau@iupui.edu) is Associate Professor and Co-Director of the STEM Education Research Institute (SERI) at Indiana University Purdue University Indianapolis; and Kristin M. Bentrem (bentrem@iupui.edu) is a Ph.D. student in the Higher Education and Administration program at Indiana University Purdue University Indianapolis.

References

Autio, O., & Hansen, R. (2002). Defining and measuring technical thinking: Students' technical abilities in Finnish comprehensive schools. Journal of Technology Education, 14 (1), 5–19.

Autio, O. (2011). The development of technological competence from adolescence to adulthood. Journal of Technology Education, 22 (2), 71–89.

Carlson, S. (2005). The net generation goes to college. The Chronicle of Higher Education, 52 (7), A34–A37.

Caruso, J. B., & Kvavik, R. B. (2005). ECAR study of students and information technology, 2005: Convenience, connection, control, and learning roadmap. Retrieved from http://net.educause.edu/ir/library/pdf/ECM/ECM0506.pdf

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Los Angeles, CA: Sage Publishing.

Edwards, O. (2007, March). High tech high: A learning environment steeped in technology. Edutopia. Retrieved from http://www.edutopia.org/high-tech-high

Gamire, E., & Pearson, G. (Eds.). (2006). Tech tally: Approaches to assessing technological literacy. Washington, DC: National Academies Press.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11 (3), 255–274.

Gulek, J., & Demirtas, H. (2005). Learning with technology: The impact of laptop use on student achievement. Journal of Technology, Learning, and Assessment, 3 (2), 3–6.

International Technology and Engineering Educators Association (ITEA/ITEEA). (2000, 2002, and 2007). Standards for technological literacy: Content for the study of technology. Reston, VA: Author

Kimmelman, P. L. (2006). Implementing NCLB: Creating a knowledge framework to support school improvement. Thousand Oaks, CA: Corwin Press.

Jonassen, D. H. (2000). Toward a design theory of problem solving. Educational Technology Research and Development, 48 (4), 63–85.

Koch, D., & Sanders, M. (2011). The effects of solid modeling and visualization on technical problem solving. Journal of Technology Education, 22 (2), 3–21.

Layton, D. (1994). A school subject in the making? The search for fundamentals. In D. Layton (Ed.), Innovations in science and technology education (vol. 5). Paris: Unesco.

McGlynn, A. P. (2008). Millennials in college: How do we motivate them? The Hispanic Outlook in Higher Education, 17, 34–36.

Mentzer, N., & Becker, K. (2010). Academic preparedness as a predictor of achievement in an engineering design course. Journal of Technology Education, 22 (1), 22–42.

Mills, G. E. (2010). Action research: A guide for the teacher researcher. New York, NY: Pearson Publishing.

Moore, A. (2007). They’ve never taken a swim and thought about Jaws: Understanding the Millennial Generation. College and University, 82, 41–48.

Morgan, D. (2007). Paradigms lost and pragmatism regained: Methodological implications of combining qualitative and quantitative methods. Journal of Mixed Methods Research, 1 (1). 48–76.

Naismith, L., Lonsdale, P., Vavoula, G., & Sharples, M. (2004). Literature review in mobile technologies and learning (Report No. 11). Bristol, UK: Futurelab.

National Center for Education Statistics. (2009). Learner Outcomes. Retrieved from http://nces.ed.gov/quicktables/result.asp?SrchKeyword=&topic=Elementary%2FSecondary&Year=2009

No Child Left Behind (NCLB) Act of 2001, Pub. L. No. 107–110, § 115, Stat. 1425 (2002). Retrieved from http://www2.ed.gov/policy/elsec/leg/esea02/index.html

Pearson, G., & Young, A. T. (Eds.). (2002). Technically speaking: Why all Americans need to know more about technology. Washington, DC: National Academy of Engineering.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9, 1–6.

Shaw, A. (2009). Education in the 21st Century. Journal of Social Education Victoria, 17, 11–17.

Spires, H. A., Lee, J. K., Turner, K. A., & Johnson, J. (2008). Having our say: Middle school perspectives on school, technology, and academic engagement. Journal of Research on Technology in Education, 40, 497–515.

Todd, R. (1999). Design and technology yields a new paradigm for elementary schooling. Journal of Technology Studies, 25 (2), 26–33.

Williams, P. J. (2000). Design: The only methodology of technology? Journal of Technology Education, 7 (2), 55–71.

Wisniewski, M. A. (2010). Leadership and the Millennials: Transferring today’s technological teens into tomorrow’s leaders. Journal of Leadership Education, 9 (1), 53–68.

Woempner, C. (2007). Teaching the next generation. Mid-continent Research for Education and Learning, 1-5.


Appendix A

Post-activity survey questions.
Thank you for completing this activity. Please answer the following questions as accurately as possible.
1. Please circle whether you were given instructions by video or from your teacher.

Video or Teacher

2. Rate from 1 to 10 your ability to understand the directions that were given to you.

1 2 3 4 5 6 7 8 9 10


Appendix B

Interview Questions after the Activity

  1. Describe the instructions you were given during class.
  2. What did you like best and least about how your instructions were delivered?
  3. What were the benefits and barriers of the way your instructions were delivered?
  4. How did you overcome the barriers you described above?
  5. What other comments do you have regarding how the instructions were delivered?

Appendix C

List of Criteria for Instructions.

  • No questions will be asked
  • Only the following supplies:
    • 4 sheets of 8.5” by 11” computer paper
    • 6” of masking tape
  • No additional supplies will be given
  • A ruler and scissors may be used to measure and cut but nothing else.
  • Must work in groups of 3; if there is a group larger than 3, then form two groups of 2
  • You will have thirty minutes to complete the activity.
  • If you hold one book, your challenge is to hold as many as possible.

Appendix D

PowerPoint Presentation for Verbal Instruction

Welcome to Tech ED!


Please come in and get a yellow name tag.

Fold this in half and write your name on both sides.

Then please find a seat.


Today's Activity

  • You may not ask questions
  • You will only be given the supplies once; you may not receive any more.
  • You need to work in groups of three.

Today's Activity

  • Your challenge is as follows:
    • Construct something that will hold a textbook at least 10" off the table by using the supplies provided. It can be higher than 10" but no lower.

  • Supplies
    • 4 pieces of 8.5" × 11" computer paper and 6" of masking tape
    • You will only be given the supplies once; you may not receive any more.
    • You may use a ruler for measuring and scissors for cutting but nothing else may be used for your structure.

  • You need to work in groups of 3. If there is a group larger than 3, then form two groups of 2.


Today's Activity

  • You have 30 minutes to complete this challenge.
  • If you hold one book up, your challenge is to hold as many as possible.


Appendix E

Observation Checklist
How many students tried to form groups of numbers other than three?

How many students tried to use materials other than those provided?

How many questions were asked after the presentation?

How many groups completed the task in the amount of time provided?

General observations: