
Exploring critical components of an integrated STEM curriculum: an application of the innovation implementation framework

Abstract

Background

Increased emphasis on accountability in education reform and evidence-based practices underscores the need for research on the implementation of K-12 curricular innovations. However, detailed accounts of research examining fidelity of implementation for K-12 STEM curricula remain relatively scarce. This paper illustrates the application of one frequently cited framework for exploring fidelity of implementation, the innovation implementation framework. The paper describes how this framework was applied to identify and describe the implementation of critical components of a newly developed middle school STEM curriculum.

Results

Drawing on classroom observations, student interviews, and teacher interviews, the paper provides illustrative findings and practical examples of methodology and instruments employed over the course of a 2-year study of curriculum implementation. The paper discusses three ways in which the innovation implementation framework enhanced our understanding of curriculum implementation: specifying critical components of the curriculum and their enactment, informing instrument design and data collection, and revealing implementation patterns.

Conclusions

This paper provides support for the use of the innovation implementation framework to study the implementation of curricula developed within the context of research-practice partnerships. In addition to illustrating the application of the innovation implementation framework, the paper extends previous implementation work focused on efficacy and effectiveness studies to demonstrate the practical advantages of studying fidelity of implementation in the context of design and development projects.

With growing interest in curricular innovations in K-12 STEM (Science, Technology, Engineering, and Mathematics) settings, researchers have argued for the careful study of curriculum implementation (Fishman, Marx, Best, and Tal, 2003; Penuel, Fishman, Haugan Cheng, and Sabelli, 2011; Ruiz-Primo, 2006; Schneider, Krajcik, and Blumenfeld, 2005). Implementation research can serve a number of important purposes including documenting the degree to which an intervention is enacted as intended, allowing for more nuanced understandings of the outcomes of an intervention, informing refinements to curriculum and teacher professional development, and providing invaluable information about the conditions under which curricular innovations are likely to be successful when scaled to a broader population of teachers and students (Century and Cassata, 2016; Ruiz-Primo, 2006). As curricula introducing new approaches to STEM education are designed, there is a clear need to explore the degree to which implementation resembles what was envisioned by curriculum designers and to develop understandings of the various factors that may influence how a curricular innovation unfolds when used by students and teachers in actual classrooms (Cassata, Kim, and Century, 2015).

Fidelity of implementation

Fidelity of implementation (FOI), defined by Rudnick, Freeman, and Century (2012) as “the extent to which an enacted program is consistent with the intended program model” (p. 347), has been an active area of research since the late 1970s (e.g., Sechrest, West, Phillips, Redner, and Yeaton, 1979). Although FOI has often been utilized to evaluate interventions in health care and public health settings, some have argued that it has been under-utilized within educational research (Lee, Penfield, and Maerten-Rivera, 2009; O’Donnell, 2008). While a few early studies of K-12 curricular interventions exist (e.g., Fullan and Pomfret, 1977; Kimpston, 1985), FOI has only been a focus in educational research since the late 1990s and early 2000s. When FOI is examined, it is often treated as secondary to a larger study rather than as a major focus of educational research (Century, Freeman, and Rudnick, 2008). However, increased emphasis on accountability in education reform and evidence-based practices has fueled a need for additional research focusing more explicitly on FOI for K-12 curricular innovations (Furtak et al. 2008; Lendrum and Humphrey, 2012; O’Donnell, 2008). For example, major funding agencies like the National Science Foundation now call for the measurement of FOI when conducting impact studies investigating intervention outcomes, and the Institute of Education Sciences, in recent Education Research solicitations, has explicitly advocated for measuring implementation starting at the Development and Innovation stage of research (Institute of Education Sciences, 2013, 2020).

Given this increased interest in research exploring the implementation of K-12 education interventions, over the past decade, researchers have begun to investigate FOI in the specific context of STEM education interventions (Barker, Nugent, and Grandgenett, 2014; Borrego, Cutler, Prince, Henderson, and Froyd, 2013; Buxton et al., 2015; Castro Superfine, Marshall, and Kelso, 2015; Johnson, Pas, Bradshaw, and Ialongo, 2018; Lee et al. 2009; McNeill, Marco-Bujosa, González-Howard, and Loper, 2018; Schneider et al. 2005; Songer and Gotwals, 2005). One goal of this research has been to examine relationships between FOI and student achievement results. For example, Songer and Gotwals (2005) investigated the FOI of three inquiry-based science curricular units and found larger achievement gains among students taught by “high-fidelity” teachers than among students taught in classrooms with lower FOI. Other researchers have sought to investigate FOI as it relates to curriculum development, with the goal of using fidelity criteria at various stages of the development process to inform curriculum refinement (Lee and Chue, 2013; Schneider et al. 2005). Schneider et al. (2005) compared teachers’ actual enactment of an inquiry-based science unit on force and motion to intended implementation of the curriculum and found that teachers’ enactment ratings tended to be less consistent with the intended curriculum for the most challenging portions of the unit. This finding suggested the need to enhance educative curriculum materials and to supplement these curriculum materials with professional development and, ultimately, systemic changes to support teacher enactment of reform-based science curricula. Lee and Chue (2013) investigated FOI of a school-based science curriculum for eighth-grade students in Singapore, with the goal of exploring “how FOI concepts can organize efforts to maintain treatment fidelity and thereby serve curriculum development in science” (p. 2510). Analysis of various FOI criteria (dosage, adherence, quality of delivery, and participant responsiveness) (Dane and Schneider, 1998) indicated a relatively high level of fidelity while also identifying specific challenges teachers encountered as they implemented the student-centered curriculum within a relatively traditional learning environment.

A recent study of FOI for a middle-school curriculum (McNeill et al. 2018) focused on the scientific practice of argumentation and compared teachers’ fidelity related to curriculum procedures (e.g., adherence to the order and types of procedures) and fidelity to the curriculum’s overarching goals for argumentation. In this study, analyses of video data collected in five teachers’ classrooms during the implementation of argument lessons informed case studies describing distinct curriculum enactment patterns. For example, the study reports one case in which a teacher demonstrated high fidelity to the curriculum’s overarching goals but low fidelity for procedures due to the adaptations the teacher made to procedures in order to provide linguistic supports for English Language Learners. Given these findings, the authors argue that examining the extent to which teachers advance the overarching goals of curricula may be a better indicator of whether teachers’ enactment of the curriculum is supporting diverse students than strict fidelity to curriculum procedures.

Studies exploring the fidelity of implementation for K-12 engineering education interventions remain relatively scarce. Research exploring the defining characteristics of STEM high schools identified the engineering design process (EDP) as one of many critical components of STEM high schools (LaForce et al. 2016). This study noted that although the engineering design process was not identified as a critical component for many of the schools in the study, when the EDP was part of a STEM high school’s model, it was a particularly important component. Although this study provides an interesting insight into the integration of engineering within STEM high schools, it does not explore the actual FOI for particular approaches to engineering education within these settings. When research does examine the fidelity of implementation for engineering interventions, the full methods and results of this work are generally not described in detail. A notable exception is recent research focused on the popular Engineering is Elementary (EiE) curriculum, which describes examples of teachers implementing the curriculum with fidelity (Gruber-Hine, 2018) and the use of implementation logs to examine the fidelity of implementation (Lachapelle and Cunningham, 2019). Other than these examples, within the extant engineering education literature, we were not able to identify major studies describing the frameworks or systematic methods used to evaluate the fidelity of implementation for K-12 engineering education programs.

Although similarly rare, there are some examples of fidelity of implementation research for engineering programs within higher education. Borrego et al. (2013) explored the FOI of research-based instructional strategies (RBIS) in undergraduate engineering science courses (e.g., statics, thermodynamics). The study focused on 11 RBIS for which there had been documented use in engineering settings and evidence of positive influence on student learning (e.g., case-based teaching, just-in-time teaching, inquiry learning, collaborative learning, problem-based learning). The study’s survey of 387 faculty members indicated a wide range of RBIS implementation, with between 11% and 80% of faculty reporting that they spent time on required components of the RBIS. The study also compared respondents who used the RBIS to those who did not and found that 13 of the critical components discriminated between users and nonusers.

In addition to describing the implementation of particular interventions, extant research provides some guidance on how to conceptualize and study the implementation of STEM interventions (Buxton et al. 2015; Dane and Schneider, 1998; Dusenbury, Brannigan, Falco, and Hansen, 2003; McNeill et al. 2018; Mowbray, Holter, Teague, and Bybee, 2003; Ruiz-Primo, 2006). Buxton and colleagues (2015) draw on practice theory to reframe implementation in terms of “multiplicities of enactment” versus program adherence and FOI. Findings from their 3-year project focused on professional learning among middle school science teachers illustrate how individual teachers exercised agency to enact lessons in a variety of ways, depending on a range of personal and contextual factors. Similarly, a number of researchers have focused on describing teachers’ principled adaptations of curriculum (Borko and Klingner, 2013; DeBarger et al., 2017; Singer, Krajcik, Marx, and Clay-Chambers, 2000), noting the virtual impossibility of enacting programs exactly as intended and the potential benefits of teachers making intentional changes to interventions in order to accommodate their local conditions and students.

Drawing on previous FOI research and science education literature, Ruiz-Primo (2006) describes a multi-faceted approach to investigating FOI for inquiry-based science curricula that involves attending to an array of elements including the types of curriculum (intended, enacted, and achieved), various dimensions of curricula (theoretical stance, curriculum materials, and instructional transactions), and a number of aspects of measuring fidelity of implementation (adherence, exposure, quality of curriculum enactment, student responsiveness, and curriculum differentiation). Following the articulation of this approach, the paper illustrates its application with a study triangulating an array of data sources to examine FOI of the Foundational Approaches in Science Teaching (FAST) middle-school science curriculum. Commenting on the utility of the approach, Ruiz-Primo (2006) states “information from the diverse instruments that we have developed has given us a portrait of the diverse ways in which FAST teachers are implementing the curriculum and how these forms of curriculum enactment affect student learning” (p. 38). Ruiz-Primo then offers a number of lessons learned through the study of FOI, noting the importance of clarity when defining critical components and specifying which variations in implementation will be considered minor or major, the advantages of planning FOI studies in parallel with initial program development, the necessity of observation data (direct observation or video), and the importance of using multiple methods and multiple sources to develop evidence of FOI.

Despite the increased prevalence of research on curriculum implementation, reviews of research on FOI have noted a lack of systematic measurement procedures and frameworks that can be applied to study the implementation of curricular interventions (Century, Rudnick, and Freeman, 2010; Lee and Chue, 2013; O’Donnell, 2008; Ruiz-Primo, 2006). Century et al. (2010) refer to the fidelity of implementation as “the black box” of evaluating interventions, noting that researchers have struggled to develop a shared conceptual understanding of FOI and how to measure it:

We create FOI measures based on their particular contexts and programs of interest, leaving the field with a collection of disparate measures and ad hoc theories about FOI. With no shared basis for measuring and discussing FOI, we are unable to compare findings across studies of particular interventions or accumulate knowledge on FOI itself. (p. 200).

Purpose

This paper represents one effort to illuminate the “black box” of fidelity of implementation by describing the application of one promising framework, the innovation implementation framework (Century and Cassata, 2016; Century, Cassata, Rudnick, and Freeman, 2012; Century et al. 2010). Specifically, the paper describes how this framework was applied to identify and describe the implementation of critical components of a new curriculum implemented in sixth–eighth-grade engineering classes. Thus, the paper focuses on addressing the question: how can the innovation implementation framework be utilized to explore FOI for an innovative middle school engineering curriculum? The study provides practical examples of methodology and instruments employed to track curriculum implementation and illustrates how the framework facilitated the development of findings related to the curriculum’s critical components.

Conceptual framework

The innovation implementation framework

Based on their work at the University of Chicago’s Center for Elementary Mathematics and Science Education (CEMSE) (now known as Outlier Research and Evaluation at UChicago STEM Education), Century and colleagues provide a useful conceptual framework for examining innovation implementation, defined as “the extent to which innovation components are in use at a particular moment in time” (Century and Cassata, 2014, p. 87). As implied by this definition, the innovation implementation framework conceptualizes an innovation as complex and constituted by essential parts or components. The framework identifies two major types of components: structural and interactional. Structural components include “organizational, design, and support elements that are the building blocks of the innovation” (p. 88) and are further divided into procedural components (organizing steps, design elements of the innovation itself) and educative components (support elements that communicate what users need to know). Interactional components include the “behaviors, interactions, and practices of users during enactment” (p. 88) and are typically organized according to user groups (e.g., teachers, students). Within the category of interactional components, pedagogical components focus on the actions expected of teachers when implementing the intervention, and learner engagement components focus on how students are expected to engage when participating in the intervention.

In addition to this approach to defining and categorizing components, Century et al. (2012) describe a number of key ideas related to the framework that are particularly relevant for the current study. First is the notion that innovations vary in terms of the number and type of components and the degree to which components are either explicit or implicit within the intended program model. Some innovations focus more on structural components while others prioritize interactional components. As described below, while we attended to certain structural components, we focus primarily on interactional components, which vary somewhat in the degree to which they are explicit within and across the sixth-, seventh-, and eighth-grade STEM-ID courses. Second, Century et al. emphasize that “full implementation of all critical components is not necessarily optimal” (p. 348), noting that appropriate enactment varies depending on contexts and conditions. Similarly, Century and Cassata (2014) discuss the difference between investigations of implementation fidelity, in which evidence is gathered to compare actual implementation to a theoretical ideal, and investigations focused on innovation use. Given the broad consensus that innovations are almost never implemented exactly as intended, Century et al. encourage measuring how components of an innovation are used rather than focusing on fidelity of the innovation as a whole. It is this conceptualization of innovation use that characterizes our approach to studying curriculum implementation. While we collaborated with curriculum developers to understand what they intended in order to identify critical components, we sought to go beyond determining whether the curriculum was implemented with fidelity to learn about how various components were enacted as the newly developed curricula unfolded in actual classrooms.

In addition to understanding how the innovation was enacted, we were also interested in learning about any contextual factors that may have influenced why teachers and students engaged with the curriculum the way they did. For this line of inquiry, we drew upon the Factor Framework (Century and Cassata, 2014; Century et al. 2012), which outlines a comprehensive set of potential factors influencing innovation enactment. These factors are organized into five categories: characteristics of the innovation, characteristics of individual users, characteristics of the organization, elements of the environment, and networks. Because our primary goal was to define and document the implementation of critical components, we did not seek to explicitly measure the multitude of factors within this framework that could have influenced implementation. However, the Factor Framework was a useful resource as we collected and analyzed interview and observation data related to contextual factors influencing implementation. Specifically, as we designed protocols and coded interview and observation data, we consulted the framework to identify characteristics of teachers, aspects of the intervention, and school-level contextual factors that may have either facilitated or hindered curriculum implementation.

In their work articulating the innovation implementation and factor frameworks, researchers from CEMSE provide a number of informative examples illustrating how they have utilized the frameworks to examine the implementation of educational innovations (Cassata et al. 2015; Century and Cassata, 2014; Century et al. 2012; LaForce et al. 2016). Century et al. (2012) describe how the innovation implementation framework was originally conceptualized as they sought to develop a suite of instruments to measure the implementation of K-8 reform-based science and mathematics curriculum across multiple programs. In another line of research, the frameworks were applied to examine the implementation, spread, and sustainability of the Everyday Mathematics program across multiple school districts (Cassata et al. 2015). LaForce et al. (2016) detail their use of the component approach in their national STEM School Study, which was conducted to identify the essential elements of STEM high schools. This work employed qualitative methods, including interviews with school leaders and analysis of school materials, to identify critical components within 20 STEM high schools in seven states. Findings resulted in the articulation of a framework including a total of 76 critical components of STEM schools organized into 8 elements.

Although this work describing the frameworks is frequently cited, beyond the research conducted by the frameworks’ originators, there are relatively few published studies illustrating exactly how the framework has been applied. Stains and Vickrey (2017) describe how they adapted the framework along with other approaches to FOI to study the implementation of evidence-based instructional practices (EBIP) in the context of discipline-based education research (DBER). This illustration focuses on defining and examining the critical components of one specific instructional practice, peer instruction (PI), implemented by STEM faculty. The study details the project’s process for defining critical components of PI, including both structural and instructional (i.e., interactional) components. Offerdahl, McConnell, and Boyer (2018) utilized aspects of the framework to hypothesize a set of critical components of formative assessment including both structural components (e.g., learning objectives, formative assessment prompts) and instructional components (e.g., revealing student understanding, diagnosis of in-progress learning).

Curriculum context

The STEM Innovation and Design (STEM-ID) curriculum consists of a series of semester-long (18-week) sixth-, seventh-, and eighth-grade engineering courses in which students engage in contextualized design challenges. Grounded in problem-based learning (Barrows, 1986; Krajcik et al. 1998), each grade-level curriculum is designed to build specific requisite skills leading up to a final design challenge. Table 1 provides a summary of the major activities included in the sixth-, seventh-, and eighth-grade courses.

Table 1 STEM-ID curriculum overview

Our application of the framework focuses primarily on a 2-year implementation period, which followed a 2-year development period during which the STEM-ID curriculum had been iteratively refined based on feedback from teachers and classroom observations. The first year of the implementation period (year 1) focused on identifying and documenting the critical components in schools implementing the curriculum. The second year of the implementation period (year 2) focused on confirming and elaborating upon year 1 findings. In the interest of deepening our understanding of factors influencing the implementation and how teachers’ approach to implementing the curriculum evolved, supplementary follow-up data were collected from a targeted sample during the third year of implementation (year 3).

During the implementation period, STEM-ID was the primary curriculum in technology classrooms in each of the four middle schools within the public school district participating in our NSF-sponsored Math-Science Partnership (MSP) project. The district is located in an urban fringe area outside a major city in the southeastern USA. The district serves a predominantly low-income student population, with 67% of students qualifying for free/reduced lunch. The district is also relatively diverse, with sub-groups including White (45%), Black (44%), Hispanic (7%), and other (5%) students.

A total of six teachers participated in the study, one teacher at each of the four schools, with two teachers leaving between the first and second implementation years and being replaced with teachers new to the district. Consequently, teachers’ experience with the curriculum varied across schools, from teachers who had been involved in the project from its inception to teachers implementing the curriculum for the first time. Teachers’ experience with professional development varied similarly: teachers involved while the curriculum was being developed had participated in formal, intensive professional development (e.g., summer workshops, training sessions, frequent site visits), whereas the professional development offered during the implementation period consisted mainly of individual consultations with curriculum developers and occasional informal site visits or webinars.

Identifying the critical components of the STEM-ID curriculum

Identifying the critical components of an innovation represents an essential first step in the process of studying implementation (Century et al. 2012; Ruiz-Primo, 2006). Thus, the project’s initial investigation of fidelity of implementation began with an in-depth curriculum review, exploratory classroom observations, and a series of informal interviews with teachers and curriculum developers in order to define the critical components of the curriculum. These efforts focused on determining critical components that were both reflective of the overall goals of the STEM-ID courses and clearly operationalized within the sequence of activities for each grade-level curriculum. Ultimately, we identified 10 critical components: two structural components and eight interactional components (Table 2). The structural components include one procedural component (the organization of the course according to contextualized problem-based challenges) and one educative component (the utilization of curriculum materials). In addition to following the framework’s guidance on organizing interactional components according to user groups (teachers, students), we anticipated the need to distinguish between teacher and student engagement with the critical components of the curriculum. Thus, the eight interactional components represent parallel teacher and student activity in four areas: the engineering design process, advanced manufacturing technology, collaboration, and the integration of math and science. (A schematic encoding of this component structure is sketched after Table 2.)

Table 2 STEM-ID critical components
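
To make the component taxonomy concrete, the following sketch (in Python) encodes the 10 critical components with their framework categories. The component labels are paraphrased from the description above rather than quoted from Table 2; this is an illustrative encoding, not an artifact of the study.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Component:
    name: str
    major_type: str  # "structural" or "interactional"
    sub_type: str    # "procedural", "educative", "pedagogical", or "learner engagement"

# The two structural components described in the text.
components = [
    Component("Contextualized problem-based challenges", "structural", "procedural"),
    Component("Utilization of curriculum materials", "structural", "educative"),
]

# Eight interactional components: parallel teacher (pedagogical) and
# student (learner engagement) activity in four areas.
areas = ["Engineering design process", "Advanced manufacturing technology",
         "Collaboration", "Math/science integration"]
for area, (user, sub) in product(areas, [("teacher", "pedagogical"),
                                         ("student", "learner engagement")]):
    components.append(Component(f"{area} ({user})", "interactional", sub))

assert len(components) == 10
```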

Each of the components is evident in each grade level course; however, there are variations in how components manifest across grade levels. In some instances, this variation is due to the intentional scaffolding of the curriculum from one grade level to the next. For example, while students at all three grade levels have some exposure to advanced manufacturing technology, this component is much more explicit in the seventh and eighth grade when students utilize CAD software and 3D printing technology than it is in sixth grade, which focuses mainly on developing students’ prerequisite engineering drawing skills. Thus, defining critical components involved not only identifying which components were crucial within the overall intervention but also understanding variations in how curriculum developers envisioned the components working at each grade level.

Although the identification of the critical components occurred as the first step in our study of implementation, the process of determining the components was iterative, with the initial list of components being refined several times as our understanding of the curriculum and the intentions of its developers evolved. One of the major points of negotiation and revision came when deciding which components were not critical. Certain components were excluded because, while they represented potentially positive outcomes, they were not considered central to the curriculum or necessary for successful implementation. This was the case for “STEM Career Connections”, which was included in early discussions as a possible interactional component. Although curriculum developers and teachers certainly hoped that engagement with the curriculum would spark student interest in exploring STEM careers, and certain aspects of the curriculum could be seen as conducive to fostering understanding of STEM career connections, we ultimately decided that fostering those connections was not critical for successful implementation.

Similarly, discussions with the curriculum developers allowed us to determine the degree to which teacher editions and various other curriculum materials contained critical educative resources versus guidance that was expected to be helpful but not necessarily crucial for the successful enactment of the curriculum. Through these discussions, we considered but eventually eliminated a number of structural elements as critical components. The curriculum’s teacher editions provide guidance for instructional delivery including the amount of time recommended for particular activities, the sequence of particular student and teacher actions within each of the challenges, and whether activities are designed to be completed in groups or individually. As STEM-ID courses are implemented within the context of an elective technology course, there was a general feeling that a more loosely structured curriculum worked well for teachers, who tend to have more latitude to experiment with their instructional practice than colleagues in core academic subject areas. Indeed, having observed early implementation during the curriculum’s 2-year development period, the curriculum developers saw the flexibility of the curriculum and the degree of autonomy it afforded teachers as a strength of the innovation. One area where allowing for teachers’ discretion seemed particularly important was in determining whether particular activities would be completed by individual students, in small groups, or as whole-class exercises. For example, the seventh-grade curriculum includes a rather in-depth tutorial intended to teach students how to use a CAD software program (IronCAD). Curriculum materials suggested that students should work through the tutorial materials individually, at their own pace, over the course of a 2 to 3-week period. However, exploratory observations and discussions with teachers indicated that, due to challenges with reading comprehension and the overall difficulty of mastering the new software, some students were more engaged and successful when the teacher led them through the tutorial as a group.

Other portions of the curriculum materials and teacher editions were considered indispensable educative resources. For example, utilization of the Engineering Design Log (EDL), a digital tool designed to guide the engineering design process, was considered critical. Embedded within the EDL were important definitions and cues meant to build students’ and teachers’ understanding of key engineering concepts so they could successfully navigate the engineering design process as they completed the challenges. For example, the “ideate” section of the EDL prompts students to provide pictures or sketches and descriptions of concepts in a table with the following instructions: “this table should include simple images and/or descriptions of any design concepts brainstormed by you and your team. At least three independent concepts should be brainstormed prior to evaluating. Add on additional concepts as you iterate.” Within these instructions are cues for students and teachers intended to reinforce the concept that the engineering design process is iterative, that designs should be both illustrated and described, and that students should engage in an extended brainstorming process in which they identify multiple design concepts prior to evaluating potential solutions. Other critical educative elements within the curriculum materials included overviews and documents introducing each of the grade-level challenges, information on the math and science standards and concepts aligned to each challenge, and detailed instructions for using and teaching students to use the tools and advanced manufacturing technologies included within the courses (e.g., CAD software, LEGO Robotics, 3D printers).
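
As a minimal illustration of how the EDL structures these cues, the fragment below models the “ideate” section as data. Only the instruction text is taken from the curriculum; the field names and the minimum-concepts encoding are our own illustrative choices.

```python
# Hypothetical data model for one EDL section; the instruction text is
# quoted from the curriculum, while the keys and fields are illustrative.
edl_ideate = {
    "section": "ideate",
    "instructions": (
        "this table should include simple images and/or descriptions of any "
        "design concepts brainstormed by you and your team. At least three "
        "independent concepts should be brainstormed prior to evaluating. "
        "Add on additional concepts as you iterate."
    ),
    "min_concepts": 3,  # brainstorm at least three concepts before evaluating
    "fields": ["sketch_or_image", "description"],  # each concept is illustrated and described
}
```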

Data sources

Following the identification of critical components, the project utilized classroom observations, teacher interviews, and student interviews to explore the degree to which implementation evidenced each critical component.

Classroom observations

Three researchers conducted observations in two classrooms over the course of a 3-week period during year 1, with more targeted follow-up observations in year 2. Due to limitations in resources, we were not able to conduct extended classroom observations at all four school sites. Thus, we purposively opted to observe at schools with teachers of varying experience levels. The research team focused observations in one school that was new to the project and one school that had participated in the project from its inception. In year 1, observations were conducted each day of the 2 to 3-week period during which teachers in these two schools implemented the culminating design challenge in their sixth-, seventh-, and eighth-grade classes. Due to some scheduling limitations, researchers were not always able to observe every class period; however, each observer did conduct continuous observations in at least one class period at each grade level at each school. As year 2 observations were intended primarily to confirm findings from data collected during the previous school year, the research team decided to focus year 2 classroom observations on a 5-day period during the implementation of the final design challenges in the same two teachers’ classrooms.

Observations were guided by a semi-structured protocol intended to provide guidance on specific elements related to critical components while remaining sufficiently general to be used for all three grade-level courses. Therefore, the protocol included both checklist items and space devoted to field notes related to each critical component. For example, in the section of the protocol aligned to the engineering design process, observers checked which of the six stages of the process students engaged in and then recorded accompanying written observations in the space provided. The protocol also included space for observers to rate the overall level of student engagement and to describe adaptations in the event that an activity was implemented in a way that was noticeably different from how it was planned or described in the curriculum’s teacher edition. See Table 3 for an excerpt from the observation protocol; a hypothetical encoding of one observation record follows the table.

Table 3 Example items from STEM-ID observation protocol
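
To suggest how such semi-structured records might be captured for later synthesis, the sketch below encodes one observation session. The engagement scale and two of the six stage labels are assumptions; the paper explicitly names only the identify/understand-the-problem, ideate, evaluate, and prototype-and-test stages.

```python
from dataclasses import dataclass, field
from typing import Dict

# Stage labels are partly assumed: "research" and "communicate" are
# placeholders for the two stages the paper does not name.
EDP_STAGES = ["identify the problem", "research", "ideate",
              "evaluate", "prototype and test", "communicate"]

@dataclass
class ObservationRecord:
    school: str
    grade: int
    session_date: str
    # Checklist item: which EDP stages students engaged in this session.
    edp_stages_observed: Dict[str, bool] = field(
        default_factory=lambda: {s: False for s in EDP_STAGES})
    engagement_rating: int = 0  # overall student engagement; scale assumed, e.g., 1 (low) to 4 (high)
    field_notes: str = ""       # running notes keyed to each critical component
    adaptations: str = ""       # departures from the teacher edition
```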

Interviews

Semi-structured interviews were conducted with each of the teachers who implemented the STEM-ID courses. A total of 10 individual interviews were conducted, including annual interviews with each teacher implementing the curriculum during years 1 and 2 and follow-up interviews with two teachers in year 3. Each of these interviews occurred at the end of a semester, as teachers were completing the implementation of the STEM-ID courses. Interview discussions were guided by a semi-structured protocol developed by project researchers (Table 4). The protocol includes questions and follow-up prompts aligned to each critical component along with questions aligned to two areas within Century et al.’s (2012) Factor Framework: characteristics of users (teachers) and characteristics of the organization (schools). These questions were intended to gather preliminary data related to teacher characteristics and school-level factors that may influence curriculum implementation. Interviews lasted 45–60 min and were conducted in a quiet area (classroom, media center) at each school. An additional joint interview was conducted with two teachers attending the project’s professional development institute held during the summer between year 1 and year 2. This joint interview was utilized primarily as an opportunity to engage teachers more explicitly in a discussion of implementation and the curriculum’s critical components. Specifically, this discussion focused on having teachers generate and discuss what they believed were the critical components of the STEM-ID courses. All teacher interview sessions were audio-recorded and transcribed for analysis.

Table 4 Example items from STEM-ID interview protocols

Student interviews were intended to gain insight into the experiences of sixth–eighth-grade students participating in the project, with a sub-set of questions related to various critical components. Interviews were conducted with students in all four schools at the end of year 1. A stratified sampling procedure was utilized in order to select a sample of 92 students (6th grade n = 32; 7th grade n = 34; 8th grade n = 26) representative of a range of academic achievement levels. The demographics of the interview sample were representative of the district with regard to race/ethnicity, socio-economic status, and gender. Student interviews lasted approximately 20 min and were conducted by one of four researchers in a quiet area in each school during the STEM-ID class meeting time. Similar to teacher interviews, student interviews utilized a semi-structured protocol with questions and prompts aligned to each of the critical components (Table 4). All interviews were audio-recorded and transcribed for analysis.
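
A minimal sketch of how such a stratified draw might be implemented, assuming hypothetical student records with grade and achievement-band fields (the study reports stratification by achievement level but not its exact mechanics):

```python
import random
from collections import defaultdict

def stratified_sample(students, n_per_grade, seed=2017):
    """Draw a grade-level sample spread across achievement bands.

    `students` is a list of dicts with "id", "grade", and "band" keys;
    these field names (and low/middle/high banding) are illustrative,
    not the study's actual records.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        strata[(s["grade"], s["band"])].append(s)

    sample = []
    for grade, n in n_per_grade.items():
        bands = sorted(b for (g, b) in strata if g == grade)
        quota, extra = divmod(n, len(bands))  # spread n across the bands
        for i, band in enumerate(bands):
            k = quota + (1 if i < extra else 0)
            pool = strata[(grade, band)]
            sample.extend(rng.sample(pool, min(k, len(pool))))
    return sample

# Year 1 interview sample sizes reported in the text.
n_per_grade = {6: 32, 7: 34, 8: 26}
```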

Data analysis

In order to expedite data analysis and reporting of teacher interview data to the curriculum team, contact summary forms (Miles, Huberman, and Saldaña, 2014) were completed following each teacher interview. These forms were intended to capture researcher impressions and document the main points and potential themes that emerged in each interview. As an initial phase of analysis, summary forms were reviewed alongside interview transcripts to generate individual teacher summaries, which were then shared with curriculum developers to inform immediate refinements to the curriculum and professional development.

Interview and observation data were then analyzed using a process of sequential qualitative analysis recommended by Miles et al. (2014). In the first stage of analysis, three coders applied a provisional start-list of codes to a sub-set of the student interview data. Coding focused on identifying instances within interview and observation data that illustrated teacher and student experiences with the critical components as the curriculum was implemented. Following discussion among the coders, the initial set of codes was refined in order to further clarify code definitions and incorporate additional sub-codes related to the critical components. For example, through these discussions, the research team decided to expand the codes related to the engineering design process component in order to more specifically identify the particular stages and aspects of the engineering design process and to include a number of potential themes or patterns that emerged from the initial coding of the data. After achieving reliability (92% agreement) with the revised coding scheme on a second sub-set of interviews, the remaining interviews were divided among the three researchers for coding. An excerpt from the coding scheme utilized to code student interviews is provided in Table 5. Using a similar coding scheme, the teacher interviews were coded by a member of the research team who conducted the interviews. In addition to codes for the critical components, the code list for teacher interviews included a number of relevant factors from the Factor Framework in order to identify interview data pertaining to specific factors that may have influenced the implementation of the curriculum. Observation data were analyzed by synthesizing protocols for all observed class sessions, with data from the checklist items related to critical components compiled in a spreadsheet. Observation notes were coded using a code list similar to the one used for interview data. All interview transcripts and observation field notes were coded using the NVivo software program.
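
One simple way to compute the kind of agreement statistic reported above is shown below; this double-coding percent-agreement check is a plausible reading of the reported 92% figure, not the study's documented procedure.

```python
def percent_agreement(coder_a, coder_b):
    """Percent agreement between two coders' code assignments.

    Each argument maps an excerpt ID to the set of codes applied to it;
    the excerpt IDs and code labels below are invented for illustration.
    """
    shared = sorted(set(coder_a) & set(coder_b))
    agree = sum(coder_a[k] == coder_b[k] for k in shared)
    return agree / len(shared) if shared else 0.0

a = {"s04": {"EDP_ideate"}, "s05": {"COLLAB"}, "s06": {"AMT_CAD"}}
b = {"s04": {"EDP_ideate"}, "s05": {"COLLAB"}, "s06": {"EDP_test"}}
print(round(percent_agreement(a, b), 2))  # 0.67 on this toy example
```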

Table 5 Excerpt of coding scheme

Coded interview and observation data were then synthesized to create a series of conceptually clustered matrices describing findings pertaining to each critical component. In addition to matrices describing general trends in implementation for each component across teachers and students, matrices were created to illustrate various implementation patterns (e.g., variations within particular critical components). These matrices were then utilized to draft narrative summaries describing the implementation of the critical components.
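
A minimal sketch of the clustering step this involves, assuming a hypothetical (teacher, component, excerpt) export from the coding software:

```python
from collections import defaultdict

def cluster_by_component(coded_segments):
    """Arrange coded excerpts into a component-by-teacher matrix.

    `coded_segments` is an iterable of (teacher, component, excerpt)
    tuples; this layout is assumed for illustration.
    """
    matrix = defaultdict(lambda: defaultdict(list))
    for teacher, component, excerpt in coded_segments:
        matrix[component][teacher].append(excerpt)
    return matrix

segments = [
    ("Teacher 1", "Collaboration", "Students assigned team roles..."),
    ("Teacher 2", "Math/science integration", "Mini-lesson on decimals..."),
    ("Teacher 2", "Collaboration", "CAD tutorial completed as a whole class..."),
]
matrix = cluster_by_component(segments)
for component, by_teacher in matrix.items():
    print(component, {t: len(ex) for t, ex in by_teacher.items()})
```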

Findings

In this section of the paper, we describe our application of the innovation implementation framework. Because this paper is intended primarily to illustrate our methodology and application of the framework, we do not provide a full elaboration of implementation findings for each of the critical components. Rather, we provide examples to illustrate how data gathered using this approach advanced the project’s understanding of implementation. Specifically, we provide illustrative examples describing how the framework facilitated the project’s capacity to: 1) clearly specify the curriculum’s critical components and their enactment, 2) design instruments aligned to critical components, and 3) detect implementation patterns to inform curriculum refinement and teacher professional development.

Specification of curriculum and its enactment

Extant FOI research emphasizes the clear specification of innovations as an important step in studying implementation (Century and Cassata, 2014, 2016; Ruiz-Primo, 2006). Both through the identification of critical components and our subsequent inquiry related to each component, we were able to arrive at a clear specification of the curriculum. We found the componential approach to be particularly useful within the ever-evolving context of a major design-based implementation research project. Taking the time to consult with curriculum developers to identify critical components created a common language among the researchers, curriculum developers, and program partners. Using critical components as a sort of guidepost within our investigation allowed for a level of clarity and specificity that would be difficult to achieve if we had simply compared observations and accounts of implementation from teachers and students to a general, idealized version of what should be happening in the STEM-ID classroom.

One instance that highlighted the utility of critical components was the joint interview with two teachers conducted during the project’s summer institute held between year 1 and year 2. In this session, we introduced teachers to the idea of critical components and, before sharing the components we had identified, asked the teachers to generate what they believed were the critical components of STEM-ID. Interestingly, with one notable exception, the critical components identified by teachers fell into the categories we had previously identified (math/science integration, engineering design process, advanced manufacturing technology). The exception was collaboration, which teachers did not spontaneously identify as a component of the curriculum. When triangulated with interview and observation data related to collaboration, this omission and the teachers’ subsequent discussion of it lent valuable insight into the specific challenges of facilitating collaboration within STEM-ID classrooms. In turn, we were then able to further engage curriculum developers in discussions regarding expectations for collaboration, refining our understanding of the variations in collaboration that fall within the boundaries of acceptable curriculum implementation. Through these conversations, we were able to distinguish between activities where teachers could use their discretion about whether and how students collaborate and activities where collaboration was essentially non-negotiable. For example, within the seventh-grade course, teachers could use their discretion when implementing the IronCAD tutorial in which students learn 3D modeling, as this is an activity that students could complete individually, in groups, or as a whole class. Indeed, in our observations, we noted all three approaches as the tutorial was implemented.

Instrument design and data collection

Following the initial specification of critical components, we were able to develop an array of tools (observation protocols, student and teacher interview protocols, coding schemes) to facilitate the collection and analysis of implementation data. Having a clearly defined set of critical components focused both the development of our research instruments and the actual process of data collection. This was particularly important for our project, given the scope of the semester-long curriculum and limitations on resources for data collection and analysis. We were able to strategically target classroom observations for curriculum sessions where students and teachers would be most likely to engage in the critical components. Specifically, we made the decision to observe in classrooms during the final weeks of curriculum implementation during culminating design challenges because we knew there were opportunities for each of the critical components to manifest in STEM-ID classrooms during that portion of the curriculum. We were also able to use the critical components to guide semi-structured interviews with students and teachers, choosing to use limited interview time for follow-up questions and probes related to the critical components over more tangential topics.

In some cases, the critical components informed adaptations to our data collection strategy. For example, in classroom observations conducted to test our protocol, we noted the challenge of discerning whether students were engaged in activities related to the advanced manufacturing technology component, the engineering design process component, or both. Often students spent the majority of each class period working on various activities on their computers, but researchers who observed passively found it difficult to document what students were doing vis-à-vis the critical components. Given this challenge, we revised our classroom observation strategy to include instructions for researchers to periodically walk around the classroom so they could get a better view of students’ computer screens and more accurately document engagement with the specific critical components. While this is perhaps a challenge inherent in classroom observations any time student engagement is mediated by computers and therefore less obvious to observers, having the critical components as our guide for classroom observations made developing a strategy to address this challenge a priority. If we had been observing implementation using a more general approach, we might have been satisfied with general observations of student engagement. Similarly, if we had been focused on typical fidelity criteria such as dosage, we might have addressed this challenge by adjusting our strategy to simply document whether students were engaged in any activity within the curriculum (e.g., recording whether students were on-task or off-task) rather than determining how students engaged with the curriculum’s critical components.

True to the intent of the framework, because our data collection instruments centered on the critical components, they were well aligned with the intervention while also being general enough that they could conceivably be used across projects investigating similar components. For example, interview questions and portions of our observation protocol designed to investigate the implementation of the engineering design process component were developed according to the specific EDP model used in the course. At the same time, to the extent that EDP models tend to be quite similar across interventions, our instruments could easily be utilized or adapted by other interventions centered on engaging students in the engineering design process. Indeed, we expect that our instruments may be of particular interest to researchers in K-12 engineering education, an area of educational research that, in recent years, has begun to establish a stronger tradition of qualitative research (Case and Light, 2011).

Expectations that certain components (engineering design process) were absolutely central and explicit whereas others were more implicit (math/science integration) and perhaps not quite as critical (collaboration) also informed our data collection strategy. For example, we knew that interview data would likely provide important additional information and context related to the more implicit components that may not be evident in classroom observations, so we made sure to devote significant time to the more implicit components within our interviews.

Revealing implementation patterns

Our utilization of the innovation implementation conceptual framework revealed a number of implementation patterns related to the critical components. Rather than providing a simplistic assessment of the average fidelity of STEM-ID implementation, we were able to describe variations in implementation across and sometimes within each of the critical components, across teachers, and across the three grade-level courses. For example, we found clear variations within the advanced manufacturing technology critical component, with student engagement and teacher facilitation fluctuating depending on the manufacturing technology being utilized. While 3D printing technologies were embraced nearly universally by teachers and students, there was a greater reluctance to use the CAD program (i.e., IronCAD) introduced in the seventh-grade course both among students and, to varying degrees, among teachers.

Century et al. (2012) note that components can vary in the degree to which they are implicit or explicit within an innovation and that some critical components may be more “critical” than others. Indeed, as evidenced by our teachers’ omission of collaboration in the joint interview described above and by subsequent observation and interview data, we found that the collaboration component was one of the more implicit within the curriculum. Similarly, although teachers and students could recall the contextualized problem-based challenges that organized the course, beyond the initial class sessions in which these problems were presented, students and teachers rarely referenced the challenges, and the challenges did not seem to motivate students’ ongoing engagement with curriculum activities. Thus, while clearly implemented as prescribed, the contextualized problem-based challenges component seemed to be less critical than others.

Century and Cassata (2014) emphasize the contextual nature of implementation, defining innovation implementation as “the extent to which innovation components are in use at a particular moment in time” (p. 87). In addition to providing snapshots of curriculum implementation at various timepoints, data collected over multiple years of implementation allowed us to examine the persistence of certain implementation patterns or tendencies as different students and teachers interacted with the curriculum. For example, in our effort to document student and teacher engagement with the engineering design process in year 1, we accumulated clear evidence that students tend to engage actively in certain stages of the design process (e.g., prototyping and testing) while frequently neglecting other stages (e.g., identifying and understanding the problem). We then found the same uneven engagement across the stages of the engineering design process in each of the classrooms where we collected implementation data in year 2. While we cannot necessarily generalize this finding beyond the project, examining implementation data related to the engineering design process as a critical component in multiple classrooms over 2 years suggests that this tendency was not merely typical of a certain cohort of students or an individual teacher’s enactment but perhaps a challenge inherent in the curriculum and, possibly, in teaching the engineering design process.
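
A minimal sketch of the kind of tally that surfaces this uneven-engagement pattern, using invented session checklists (the stage names and counts are illustrative, not study data):

```python
from collections import Counter

def stage_frequencies(sessions):
    """Tally how often each EDP stage was checked across observed sessions.

    Each session is a {stage: bool} checklist like the ObservationRecord
    sketch above, here given as plain dicts.
    """
    counts = Counter()
    for checklist in sessions:
        counts.update(stage for stage, seen in checklist.items() if seen)
    return counts

year1 = [
    {"identify the problem": False, "ideate": True, "prototype and test": True},
    {"identify the problem": False, "ideate": False, "prototype and test": True},
]
print(stage_frequencies(year1))
# Counter({'prototype and test': 2, 'ideate': 1}) -- the kind of uneven
# profile described above, with early stages under-represented.
```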

By triangulating interview and observation data, we were able to discern and create matrices illustrating low, moderate, and high implementation levels for each critical component by teacher and implementation year. Figure 1 provides an example of an implementation matrix for two teachers. Examining patterns in these data, we noted a tendency for teachers to begin prioritizing certain critical components as they gained more experience with the curriculum. For example, in the following excerpt from an interview with Teacher 2 conducted at the end of his second year implementing the curriculum, he discusses how he decided to prioritize the advanced manufacturing technology and math/science integration components over the engineering design process.

Teacher 2: You know, I don’t think that they understand the engineering process super well. I really think that I spend most of my time teaching the IronCAD skills…So my main focus has been IronCAD and highlighting math whenever we can in the curriculum and if it takes longer because we have to stop and teach a unit on decimals and on rounding or on measurement, you just prioritize, and you know, focus, understanding that ‘hey, if my kids cannot add decimals and they can’t round decimals, they can’t even write money as a decimal, then they can’t get a favorable outcome in the carnival challenge’. They can’t do the challenge ‘cause you have to be able to do that…

Interviewer: So it sounds like you really focus on building those foundational math skills even if that meant that maybe students wouldn’t be able to go through the entire engineering design process.

Teacher 2: Yeah, they don’t get done, we don’t get as far in the curriculum as I would like, but it’s like, when your students are so below where they need to be to even accomplish the outcome, you gotta’ start and get their skills built up.

This teacher goes on to describe supplemental materials he had created aligned to these components, including math and science lesson plans and a collection of instructional videos he made to guide students as they learned the CAD software used in the curriculum.

Fig. 1 Matrix illustrating levels of implementation for two teachers by year

Implementation factors

Although this study does not focus on exploring factors influencing implementation, we did use the Factor Framework to guide our discussions of implementation context during teacher interviews. Overall, we found the taxonomy of factors related to the intervention, users (teachers), and environment to be a useful lens for beginning to understand how certain contextual factors may have influenced implementation. For example, cross-case analysis of teacher interview data indicated that school-level policies regarding support of special education students had clear implications for the implementation of certain critical components. Specifically, teachers explained that because the curriculum was being implemented in a “connections” course rather than in one of the core subject-area classes, students who typically had an aide to assist them did not receive this support during their class sessions. This meant that most teachers struggled to provide adequate support for students with special needs, especially for activities that required grade-level reading comprehension or mathematics skills. For instance, teachers reported that lower-performing seventh-grade students typically did not have the requisite reading comprehension skills to complete the tutorial teaching them how to use the IronCAD software program they would use throughout the seventh- and eighth-grade courses.

Interview data also suggested a number of teacher characteristics that seemed to influence curriculum implementation, including teachers’ self-efficacy for particular aspects of the curriculum, previous career and teaching experiences, and understanding of the engineering design process. Asked to discuss their level of confidence with the curriculum, teachers often indicated that they were quite confident overall, while noting particular areas where they felt especially capable and areas where they had doubts about their ability to implement the curriculum. The areas where teachers felt most confident tended to align with their previous career and teaching experience. For example, teachers with a background in engineering or manufacturing tended to express more confidence with regard to engaging students in the engineering design process and utilizing advanced manufacturing technology than teachers who entered teaching from other professions or transferred to teaching the engineering course from teaching math, science, or other subjects. Similarly, teachers with no prior mathematics or science teaching experience tended to report that they were less inclined to focus on the math and science concepts within the course than teachers with experience teaching middle school math or science. Although professional development sessions aimed to equip teachers with a working understanding of the engineering design process, and most teachers demonstrated a clear understanding of the nature of the process and the activities expected at each stage, interviews occasionally revealed misconceptions that likely influenced implementation. For example, interview data indicated that one teacher, much like many students, had difficulty distinguishing the “evaluate” stage, during which potential concepts are evaluated for whether they meet requirements, from the “prototype and test” stage, during which a working prototype is constructed and tested.

Our findings, including implementation patterns and data suggesting potential factors influencing implementation, were communicated to the project team in order to inform refinements to curriculum materials and teacher professional development. For instance, findings on student use of advanced manufacturing technology led to the simplification of the tutorial materials guiding students as they learned the CAD software used in the seventh-grade course. Implementation patterns showing variations in student engagement and teacher facilitation of the engineering design process helped the curriculum development team prioritize the refinement of the Engineering Design Log, building in additional educative prompts intended to address misconceptions and increase the likelihood that students would engage meaningfully with each stage. Similarly, based on findings that teachers tended to emphasize certain stages of the engineering design process over others, the project adjusted professional development sessions to include additional facilitation strategies and to reinforce the purpose of and interconnections among the stages.

Discussion

Although measuring implementation has become an important step in the development of STEM innovations, too often FOI work is relegated to secondary status within larger research projects. Consequently, the methods and findings of implementation studies often remain unpublished, shared only internally within a research project or with funding agencies. Over the years, researchers have offered a number of promising frameworks and approaches for investigating implementation (Century and Cassata, 2016; Ruiz-Primo, 2006); however, without specific examples of how these approaches work in practice, it may be difficult for research teams to select the most appropriate strategy for understanding how innovations unfold in schools and classrooms. By illustrating the application of the innovation implementation framework, we provide one example that may be instructive for other projects interested in the implementation of STEM innovations.

As illustrated in the findings above, the application of the innovation implementation framework clearly enhanced our investigation of curriculum implementation. The framework’s componential approach enabled us to clarify the project’s understanding of what was critical within the STEM-ID curriculum and to design an implementation study that focused not merely on whether implementation resembled the intentions of curriculum developers but also on how this new curriculum was actually being used by teachers and students. Once defined, the critical components focused our efforts to develop data collection protocols, collect interview and observation data, and analyze that data to reveal implementation patterns.
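To make the idea of deriving implementation patterns from coded data concrete, the following minimal sketch (in Python) shows how coded observation records might be aggregated into a teacher-by-component matrix like the one pictured above. This is our illustration only, not the project’s actual analysis pipeline; the teacher labels, component names, and ordinal rubric levels are all hypothetical.

```python
# Hypothetical sketch only: aggregates coded observation records into a
# teacher-by-component implementation matrix. All names and levels below
# are invented placeholders, not data from the study.
from collections import defaultdict

# Each record: (teacher, critical component, coded level), where levels
# follow an assumed ordinal rubric: 0 = absent, 1 = partial, 2 = full.
coded_records = [
    ("Teacher A", "EDP: evaluate stage", 1),
    ("Teacher A", "Math/science integration", 2),
    ("Teacher B", "EDP: evaluate stage", 0),
    ("Teacher B", "Math/science integration", 1),
]

# Keep the highest level observed for each teacher-component pair,
# producing one row of the implementation matrix per teacher.
matrix = defaultdict(dict)
for teacher, component, level in coded_records:
    matrix[teacher][component] = max(matrix[teacher].get(component, 0), level)

for teacher in sorted(matrix):
    print(teacher, matrix[teacher])
```

Tabulating levels in this way makes cross-case comparisons (e.g., between teachers or across years) straightforward, although in practice the coded levels would come from rubrics applied to observation and interview data rather than hand-entered values.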

We found the flexibility of this framework, which allows a project to define any number of critical components that may be implicit or explicit and more or less “critical”, to be particularly important for the types of implementation activities we focused on. Although common fidelity criteria (e.g., dosage) may provide important evidence for many projects, such data would have been of relatively little interest and quite difficult, if not impossible, to collect reliably within the context of our project. For example, because students work on design challenges collaboratively, at their own pace, over the course of several weeks, often with only occasional guidance from their teacher, determining dosage (e.g., amount of time spent) for the engineering design process overall, or within any one stage of the process, would have been exceedingly difficult. At the same time, although the curriculum provided general guidelines on pacing and our interview and observation data did provide useful information on engagement across the curriculum, knowing how much time teachers or students spent on any specific activity was not of particular interest to the research team. What was of interest were questions of curriculum use: How would eighth-grade students use LEGO robotics to complete their engineering design challenge? Which manufacturing technologies were utilized in the classroom, and were these uses student- or teacher-centered? Would teachers with limited mathematics and science backgrounds embrace the curriculum’s math/science integration component, or would these portions of the curriculum be treated superficially or ignored altogether? It is these types of questions centered on use that we found the innovation implementation framework most useful for addressing.

Often the FOI literature describes the study of implementation in the context of efficacy or effectiveness research (O’Donnell, 2008; Stains and Vickrey, 2017). Efficacy studies are typically concerned with outcomes when an intervention is implemented under what may be considered “ideal” conditions, such as with highly trained teachers or with additional support for implementation. Effectiveness studies focus on outcomes when interventions are implemented “under conditions of routine practice” (Institute of Education Sciences, 2013). Although we can certainly see the benefits of applying the innovation implementation framework in efficacy and effectiveness studies, given the value we found in using the framework for early-stage implementation research at the design and development stage, we suggest that a project need not have reached the stage where efficacy studies are conducted in order to be a “candidate” for an implementation study. Following 2 years of iterative refinement based on the initial implementation in engineering classrooms, we found that the curriculum was sufficiently developed that the project could specify critical components and corresponding expectations for teacher facilitation and student engagement. Although the curriculum was not being implemented under optimal conditions, as in an efficacy study, we found that examining implementation patterns across a variety of classrooms only deepened our understanding of how the curriculum was being used and what would be required for successful implementation. Indeed, based on our experience with curriculum development in the context of a large design-based implementation research project, we would recommend using the framework to examine innovation use rather than strict fidelity criteria in design and development studies.

Consistent with previous work examining teachers’ curriculum adaptations and variations in enactment (Buxton et al. 2015; DeBarger et al. 2017; McNeill et al. 2018), the implementation patterns and influencing factors we identified underscore the importance of attending to implementation context and the ways in which teachers exercise agency as they enact innovations. Although certain enactments, such as skipping stages of the engineering design process, were clearly counter to the goals of the curriculum, teachers also adapted and supplemented the curriculum in ways that clearly supported the engagement of diverse students in the design challenges.

Consistent with recommendations from previous implementation research (Ruiz-Primo, 2006), we found the collection and triangulation of multiple data sources to be crucial. Although classroom observations are resource-intensive, we agree with other researchers that they are indispensable when exploring implementation. However, as noted above, the framework allowed us to focus our classroom observations on the portions of the curriculum where critical components were most likely to be evident rather than investing resources in observing over the course of the semester-long curriculum. Although implementation is most directly measured through observation, we found that interview data provided important additional information and context for the interpretation of observation data. Perhaps due to the logistical and methodological challenges inherent in collecting student interview data, studies including student interviews appear to be far less common than studies employing teacher interviews. Given the student-centered nature of the curriculum and the number of critical components that depended on student engagement, we found student interviews to be a fruitful data source, well worth the time and effort invested in data collection.

Limitations

Our application of the innovation implementation framework was not without limitations. Due to finite resources for data collection, our classroom observation data were somewhat limited in scope, focusing on a subset of teachers implementing a targeted portion of the curriculum. In combination with interview data, these observations yielded useful implementation data; however, observations conducted over a longer time span in more classrooms would have strengthened our study. Likewise, the analysis of additional data sources may have further enhanced our understanding of implementation. Indeed, the implementation of the curriculum generated a significant body of document data that we opted not to include in the study. For example, the research team has access to the digital engineering design logs students complete as they work through the design challenges. Analysis of these logs could provide additional insights into student engagement in the engineering design process. However, given that our study was conducted in the context of a design-based implementation research project, we determined that the timeline required to analyze this document data was not well aligned with project plans to revise and prepare to scale the curriculum.

Conclusion

This work addresses important issues pertaining to the implementation of innovations in STEM education. The paper provides a much-needed description of the application of the innovation implementation framework (Century and Cassata, 2014). Beyond the descriptions provided by the framework’s authors, there are few published examples of research applying this approach to study the implementation of curricular innovations. Thus, we expect that this work will be of interest to researchers considering using this framework to guide their implementation research and that sharing this example will, perhaps, encourage other projects to disseminate their FOI research. Although this paper focuses primarily on the application of a framework for studying implementation rather than on comprehensive reporting of implementation findings, the illustrative examples we share are likely to resonate with educators, researchers, curriculum developers, and school leaders invested in STEM education initiatives. By identifying critical components and sharing patterns observed in our implementation data, we hope to contribute to the field’s ability to compare findings across studies investigating similar curricular innovations.

Availability of data and materials

Data sharing is not applicable to this article. Methodology and examples from data analyses were provided for illustrative purposes; however, the article does not present full analyses of the related dataset.

Abbreviations

DBER: Discipline-based education research
EBIP: Evidence-based instructional practices
FOI: Fidelity of implementation
PI: Peer instruction
STEM: Science, Technology, Engineering, and Mathematics
STEM-ID: Science, Technology, Engineering, and Mathematics Integrating Design

References

  • Barker, B. S., Nugent, G., & Grandgenett, N. F. (2014). Examining fidelity of program implementation in a STEM-oriented out-of-school setting. International Journal of Technology and Design Education, 24(1), 39–52. https://doi.org/10.1007/s10798-013-9245-9.

  • Barrows, H. S. (1986). A taxonomy of problem-based learning methods. Medical Education, 20(6), 481–486. https://doi.org/10.1111/j.1365-2923.1986.tb01386.x.

  • Borko, H., & Klingner, J. K. (2013). Supporting teachers in schools to improve their instructional practice. In B. J. Fishman, W. R. Penuel, A. R. Allen, & B. H. Cheng (Eds.), Design-based implementation research (National Society for the Study of Education Yearbook, 112(2), pp. 298–319). New York.

  • Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). FOI of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102, 394–425. https://doi.org/10.1002/jee.20020.

  • Buxton, C. A., Allexsaht-Snider, M., Kayumova, S., Aghasaleh, R., Choi, Y. J., & Cohen, A. (2015). Teacher agency and professional learning: Rethinking fidelity of implementation as multiplicities of enactment. Journal of Research in Science Teaching, 52(4), 489–502. https://doi.org/10.1002/tea.21223.

  • Case, J. M., & Light, G. (2011). Emerging methodologies in engineering education research. Journal of Engineering Education, 100(1), 186–210.

  • Cassata, A., Kim, D. Y., & Century, J. (2015). Understanding the “why” of implementation: Factors affecting teachers’ use of everyday mathematics. In Paper presented at Annual meeting of the American Educational Research Association, Chicago, IL.

  • Castro Superfine, A., Marshall, A. M., & Kelso, C. (2015). Fidelity of implementation: Bringing written curriculum materials into the equation. Curriculum Journal, 26(1), 164–191. https://doi.org/10.1080/09585176.2014.990910.

  • Century, J., & Cassata, A. (2014). Conceptual foundations for measuring the implementation of educational innovations. In L. M. H. Sanetti & T. R. Kratochwill (Eds.), Treatment integrity: A foundation for evidence-based practice in applied psychology (pp. 81–108). Washington, D.C.: American Psychological Association.

  • Century, J., & Cassata, A. (2016). Implementation research: Finding common ground on what, how, why, where, and who. Review of Research in Education, 40(1), 169–215. https://doi.org/10.3102/0091732X16665332.

  • Century, J., Cassata, A., Rudnick, M., & Freeman, C. (2012). Measuring enactment of innovations and the factors that affect implementation and sustainability: Moving toward common language and shared conceptual understanding. The Journal of Behavioral Health Services & Research, 39(4), 343–361. https://doi.org/10.1007/s11414-012-9287-x.

  • Century, J., Freeman, C., & Rudnick, M. (2008). Measuring fidelity of implementation of instructional materials: A conceptual framework. In Paper presented at Annual Meeting of the American Educational Research Association, New York, NY.

  • Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31(2), 199–218. https://doi.org/10.1177/1098214010366173.

  • Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.

  • DeBarger, A. H., Penuel, W. R., Moorthy, S., Beauvineau, Y., Kennedy, C. A., & Boscardin, C. K. (2017). Investigating purposeful science curriculum adaptation as a strategy to improve teaching and learning. Science Education, 101(1), 66–98. https://doi.org/10.1002/sce.21249.

  • Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237–256. https://doi.org/10.1093/her/18.2.237.

  • Fishman, B. J., Marx, R. W., Best, S., & Tal, R. T. (2003). Linking teacher and student learning to improve professional development in systemic reform. Teaching and Teacher Education, 19(6), 643–658. https://doi.org/10.1016/S0742-051X(03)00059-3.

  • Fullan, M., & Pomfret, A. (1977). Research on curriculum and instruction implementation. Review of Educational Research, 47, 335–397. https://doi.org/10.2307/1170134.

  • Furtak, E. M., Ruiz-Primo, M. A., Shemwell, J. T., Ayala, C. C., Brandon, P. R., Shavelson, R. J., & Yin, Y. (2008). On the fidelity of implementing embedded formative assessments and its relation to student learning. Applied Measurement in Education, 21(4), 360–389. https://doi.org/10.1080/08957340802347852.

  • Gruber-Hine, L. K. (2018). Engineering Is Elementary: Identifying instances of collaboration during the engineering design process. (Doctoral dissertation). Retrieved from https://surface.syr.edu/etd/848.

  • Institute of Education Sciences. (2013). Common guidelines for education research and development. https://www.nsf.gov/pubs/2013/nsf13126/nsf13126.pdf

  • Institute of Education Sciences. (2020). Education Research Grant Solicitation. https://ies.ed.gov/funding/pdf/2020_84305A.pdf. Accessed 18 Nov 2019.

  • Johnson, S. R., Pas, E. T., Bradshaw, C. P., & Ialongo, N. S. (2018). Promoting teachers’ implementation of classroom-based prevention programming through coaching: The mediating role of the coach-teacher relationship. Administration and Policy in Mental Health and Mental Health Services Research, 45(3), 404–416. https://doi.org/10.1007/s10488-017-0832-z.

  • Kimpston, R. D. (1985). Curriculum fidelity and the implementation tasks employed by teachers: A research study. Journal of Curriculum Studies, 17, 185–195. https://doi.org/10.1080/0022027850170207.

  • Krajcik, J., Blumenfeld, P. C., Marx, R. W., Bass, K. M., Fredricks, J., & Soloway, E. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. Journal of the Learning Sciences, 7(3–4), 313–350. https://doi.org/10.1080/10508406.1998.9672057.

  • Lachapelle, C. P., & Cunningham, C. M. (2019). Measuring fidelity of implementation in a large-scale research study. In Proceedings of the American Society for Engineering Education Annual Conference and Exposition, Tampa, FL.

  • LaForce, M., Noble, E., King, H., Century, J., Blackwell, C., Holt, S., Ibrahim, A., & Loo, S. (2016). The eight essential elements of inclusive STEM high schools. International Journal of STEM Education, 3(1), 21. https://doi.org/10.1186/s40594-016-0054-z.

  • Lee, O., Penfield, R., & Maerten-Rivera, J. (2009). Effects of fidelity of implementation on science achievement gains among English language learners. Journal of Research in Science Teaching, 46(7), 836–859. https://doi.org/10.1002/tea.20335.

  • Lee, Y. J., & Chue, S. (2013). The value of fidelity of implementation criteria to evaluate school-based science curriculum innovations. International Journal of Science Education, 35(15), 2508–2537. https://doi.org/10.1080/09500693.2011.609189.

  • Lendrum, A., & Humphrey, N. (2012). The importance of studying the implementation of interventions in school settings. Oxford Review of Education, 38(5), 635–652. https://doi.org/10.1080/03054985.2012.734800.

  • McNeill, K. L., Marco-Bujosa, L. M., González-Howard, M., & Loper, S. (2018). Teachers’ enactments of curriculum: Fidelity to procedure versus fidelity to goal for scientific argumentation. International Journal of Science Education, 40(12), 1455–1475. https://doi.org/10.1080/09500693.2018.1482508.

  • Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Fundamentals of qualitative data analysis. In Qualitative data analysis: A methods sourcebook (3rd ed.). Thousand Oaks, CA: Sage.

  • Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation, 24(3), 315–340.

  • O’Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K–12 curriculum intervention research. Review of Educational Research, 78(1), 33–84. https://doi.org/10.3102/0034654307313793.

  • Offerdahl, E. G., McConnell, M., & Boyer, J. (2018). Can I have your recipe? Using a fidelity of implementation (FOI) framework to identify the key ingredients of formative assessment for learning. CBE—Life Sciences Education, 17(4). https://doi.org/10.1187/cbe.18-02-0029.

  • Penuel, W. R., Fishman, B. J., Haugan Cheng, B., & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher, 40(7), 331–337. https://doi.org/10.3102/0013189X11421826.

  • Rudnick, M., Freeman, C., & Century, J. (2012). Practical applications of a fidelity-of-implementation framework. In B. Kelly & D. F. Perkins (Eds.), Handbook of implementation science for psychology in education (pp. 346–360). Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9781139013949.026.

  • Ruiz-Primo, M. A. (2006). A multi-method and multi-source approach for studying fidelity of implementation (CSE Report No. 677). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing (CRESST).

  • Schneider, R. M., Krajcik, J., & Blumenfeld, P. (2005). Enacting reform-based science materials: The range of teacher enactments in reform classrooms. Journal of Research in Science Teaching, 42(3), 283–312. https://doi.org/10.1002/tea.20055.

  • Sechreset, L., West, S. G., Phillips, M. A., Redner, R., & Yeaton, W. (1979). Some neglected problems in evaluation research: Strength and integrity of treatments. In L. Sechreset, S. G. West, M. A. Phillips, R. Redner, & W. Yeaton (Eds.), Evaluation studies review annual. Thousand Oaks: Sage.

  • Singer, J. E., Krajcik, J., Marx, R. W., & Clay-Chambers, J. (2000). Constructing extended inquiry projects: Curriculum materials for science education reform. Educational Psychologist, 35(3), 165–179. https://doi.org/10.1207/S15326985EP3503_3.

  • Songer, N. B., & Gotwals, A. W. (2005). Fidelity of implementation in three sequential curricular units. In Paper presented at Annual Meeting of the American Educational Research Association, Montreal, Canada.

  • Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE—Life Sciences Education, 16(1). https://doi.org/10.1187/cbe.16-03-0113.

Acknowledgements

The authors wish to acknowledge Dr. Roxanne Moore and Jeff Rosen, who developed the STEM-ID curriculum and provided guidance on the definition of its critical components. The authors would also like to acknowledge the participating teachers, students, and schools; the AMP-IT-UP project team, including Dr. Marion Usselman, Sabrina Grossman, Jayma Koval, and Mike Ryan, for their input on this work; and Emily Frobos and Olivia Shellman for assistance with research coordination.

Funding

This material is based upon work supported by the National Science Foundation under Grant No. 1238089. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or Georgia Institute of Technology.

Author information

Contributions

JG led the design, data collection, and analyses and drafted major sections of the manuscript. MA advised on the design of the study and instruments, collected and analyzed data, and made major contributions to writing the manuscript. JL and SN collected and analyzed observation and interview data and contributed to writing the manuscript. All authors have read and approved the manuscript.

Corresponding author

Correspondence to Jessica Gale.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Gale, J., Alemdar, M., Lingle, J. et al. Exploring critical components of an integrated STEM curriculum: an application of the innovation implementation framework. IJ STEM Ed 7, 5 (2020). https://doi.org/10.1186/s40594-020-0204-1


Keywords