
Computers & Education

Volume 72, March 2014, Pages 48-58

Student teacher communication and performance during a clinical experience supported by a technology-enhanced cognitive apprenticeship

https://doi.org/10.1016/j.compedu.2013.10.011

Highlights

  • Study examines student teachers in a technology-enhanced cognitive apprenticeship (TECA).

  • Performance assessed with a valid and reliable statewide portfolio assessment.

  • TECA students had higher planning scores (statistically significant).

  • TECA students communicated with experts more (statistically significant).

  • Results consistent with prior studies, support efficacy of design principles.

Abstract

This study is the third in a series of design-based research studies on a technology-enhanced cognitive apprenticeship (TECA) that uses a variety of technologies (e.g. video, discussion boards, performance support) to support triad member activity during the clinical experience. The purpose of this study was to examine the differences in communication and performance between student teachers who participated in the TECA during a year-long clinical experience and those who did not. Overall, performance scores were higher among students in the TECA; planning scores were higher at a statistically significant level. Communication reports suggest that online discussions through both public and private channels contributed to these results. Findings were consistent with prior studies of the TECA and supported the efficacy of key design elements. Implications for teacher education and the design of TECAs are discussed.

Introduction

There is a clear and growing call to improve the supervision of student teachers during the clinical experience with technology. The National Council for Accreditation of Teacher Education's Blue Ribbon Panel (Blue Ribbon Panel, 2010) suggested that teacher preparation should include opportunities to participate in a larger community of learners connected through technology. Gomez, Sherin, Griesdorn, and Finn (2008) and Lieberman and Mace (2010) similarly suggested that student teachers would benefit from making use of technology to receive feedback and share experiences with a wider variety of peers and experts. These uses of technology seek to achieve a common goal – to improve the preparation of teachers by improving dialog and discourse about teaching practice among student teachers and their community of experts and peers.

The call to improve teacher preparation stems from longstanding issues with the triad model (student teacher, cooperating teacher, and supervisor) commonly associated with teacher supervision. The triad model is an apprenticeship model in which a student teacher learns under the guidance of a practicing teacher – that is, the cooperating teacher – in the field. The student teacher is typically monitored and periodically observed by a supervisor from the university to determine whether he or she has demonstrated the knowledge and skills needed to earn a teaching credential. A common issue with the triad model is that the underlying cognitive processes associated with teaching are not consistently demonstrated or communicated to student teachers by the experts in their triad (Feiman-Nemser, 2001, Tang, 2003, Valencia et al., 2009). Cooperating teachers and supervisors are often provided little guidance or support for helping student teachers make practical use of the theories they learn at the university (Darling-Hammond, 2006, Levine, 2011, Zeichner, 2010). Student teachers are often left to make meaning of their clinical experience without the support of experts (Fisch & Bennett, 2011). As a result, the clinical experience becomes highly variable in quality and efficacy across student teachers and the institutions that prepare them to teach (Darling-Hammond, 2006).

Technology-enhanced cognitive apprenticeships (TECA) have the potential to address the problems that triad members encounter during the clinical experience and, in turn, the quality of teacher supervision. A technology-enhanced cognitive apprenticeship is an environment where the theories and methods of cognitive apprenticeship (e.g. modeling, coaching, scaffolding, reflection, and community-building) are used as a framework for incorporating technology into the clinical experience and improving learner outcomes (Ghefaili, 2003, Wang and Bonk, 2001). A cognitive apprenticeship is different from a traditional apprenticeship in that the skills that a novice must learn are not fully observable – rather, the focus is on learning the underlying cognitive processes that others have come to master (Collins, Brown, & Holum, 1991). This draws heavily upon Vygotsky's (1978) work in socio-cultural theory. Under this theory, learning occurs through frequent dialog with oneself and others. Dialog helps learners better understand the meaning of knowledge and its relationship with the social and cultural norms within a given context. Technology serves as a tool for learners to connect this knowledge with the world in which they live. Thus, TECAs have the potential to improve the clinical experience by generating rich dialog about teaching and teaching practices as they are occurring during the clinical experience.

Many TECAs focus on improving the methods of cognitive apprenticeship – in particular, coaching and feedback. Some researchers have reported positive results from this focus (see Jetton, 2004, Lee and Wu, 2006, Liu, 2005, White and Cornu, 2002, Wu and Lee, 2004). For example, White and Cornu (2002) found that 120 student teachers experienced lower levels of stress during the clinical experience when they used email communication to receive support and advice from peers and experts. Others, however, have reported that their use of technology did little to improve teacher learning or performance (Clift et al., 2001, Levin and Waugh, 1998, Pratt, 2008, Price and Chen, 2003). Price and Chen (2003) reported that their attempt to use online discussions to improve coaching and feedback during the clinical experience resulted in very little communication among participants.

One reason for these mixed results may be the manner in which a TECA is designed. Participants are more likely to derive benefits from a TECA when they understand what they are expected to do and have ample support when doing it (Clift et al., 2001, Fisch and Bennett, 2011). A TECA that focuses solely on coaching and feedback, then, may not improve teacher outcomes because participants may not know what to discuss or how to do so effectively. Ghefaili's (2003) framework suggests that a successful TECA consists of three essential elements of cognitive apprenticeship: the methods, content, and social aspects of learning. Student teachers immersed in such an environment are likely to have greater and more purposeful exposure to the thinking of experts and peers than those in an environment that focuses more narrowly on increasing the frequency of coaching and feedback.

There is limited but promising evidence that TECAs built around multiple elements of cognitive apprenticeship can improve teacher outcomes (Fisch and Bennett, 2011, Lee and Wu, 2006, Liu, 2005, Sherin and van Es, 2005, Wu and Lee, 2004). Wu and Lee (2004) found that 37 student teachers were better able to evaluate their own and others' teaching habits after engaging in a TECA that offered instructional modules, private discussion with experts about their own teaching, and public discussion about teaching with the broader community of experts and peers. Liu (2005) found that 24 student teachers were significantly better at planning and had more positive attitudes about teaching after participating in a web-based environment that embedded online coursework, modeling and coaching, and performance support into the clinical experience. Fisch and Bennett (2011) reported that 16 student teachers exhibited deeper reflection and critical thinking about their own teaching practice during the clinical experience when their online discussion was guided by specific course assignments. These results suggest that TECAs can be used effectively to improve student teacher performance when designed to provide all triad members with support and guidance as they engage in apprenticeship activities.

Making practical use of prior studies of TECAs is problematic. Many studies fail to clarify if or how a TECA was integrated within existing student teacher coursework (Hammond, 2005). More problematic is a reliance on qualitative accounts of teachers' attitudes toward technology or feelings about their learning rather than their teaching performance (Gentry, Denton, & Kurz, 2008). Studies that do focus on teaching performance often fail to account for factors outside the TECA that can influence one's ability to teach (Gentry et al., 2008, Kopcha and Alger, 2011). Far fewer studies examine the communication choices of learners while engaged in TECAs (Clift et al., 2001). There is a need for research that examines communication and performance among student teachers engaged in a TECA during the clinical experience. Examining the relationship between communication and performance will not only address this need but also help to establish empirically-based design principles for developing TECAs (Dennen & Burner, 2008, pp. 425–439; Hixon & So, 2009).

The purpose of this study was to examine the difference in communication and performance between student teachers who participated in a TECA during the clinical experience and those who did not. The TECA, called eSupervision, is an instructional program that uses technology to enhance the methods, content, and social aspects of cognitive apprenticeship to support triad members during the clinical experience. The program has been studied as part of an ongoing design-based research effort because of the focus on developing a theory-driven instructional solution to a specific problem in a local context (Reeves, Herrington, & Oliver, 2005). This study examines the third iteration of that effort. Studying educational phenomena over multiple iterations of an instructional design effort offers insight into the efficacy of design decisions as well as practical considerations for enacting them in applied contexts (Reeves et al., 2005).

In the prior iteration of eSupervision, Kopcha and Alger (2011) examined the teaching performance and attitudes of 38 student teachers. Though the difference was not statistically significant, eSupervision students performed better than non-eSupervision students on a statewide test of their teaching knowledge and ability. Cooperating teachers and supervisors, however, struggled to provide timely and meaningful feedback using the assessment tool embedded within the online system. The authors noted that this issue was a factor that may have contributed to the lack of statistical significance.

The current study first replicates the prior study by examining the effects of a TECA on student teacher performance during the clinical experience. Performance scores were compared between two cohorts of student teachers – one that received eSupervision and one that did not. Teacher self-efficacy and the quality of support from the cooperating teacher were accounted for in the research design because of their influence on student performance during the clinical experience (Tang, 2003, Tschannen-Moran and Hoy, 2007).

The current study also improves upon the previous study in several ways. First, the current study introduces a mechanism for providing student teachers with more timely and relevant coaching and feedback through private discussion boards. Since the current iteration is the first to make use of private discussion boards, the results of this study provide much needed insight into the effects of specific designs associated with coaching and feedback. Both are critical components of the clinical experience that can positively influence teacher performance (Anderson & Stillman, 2013). In addition, the current study examines the differences in overall communication between students who participate in eSupervision and students who do not. Examining the relationship between communication and performance in a single setting addresses a noted gap in the current literature on TECAs in teacher education.

The research questions guiding this study were:

  1. How does teacher performance differ between student teachers who participate in a TECA and those who do not?

  2. How do the method and frequency of communication about specific teaching practices differ between groups?

  3. Did student teachers use the discussion boards (public and private) as directed within the instructional program?


Participants

Fifty-four student teachers (24 eSupervision; 30 non-eSupervision) volunteered to participate in this study. The student teachers were engaged in a yearlong clinical experience as part of a post-baccalaureate secondary-level teaching credential program at a large university in the Southwest. Participants in both groups had similar levels of technology skills. During the first semester of the study, all student teachers completed a semester-long introductory technology course at the university.

Results

An underlying assumption of analysis of covariance is that the relationship between the dependent variable and the covariate is the same within all levels of each factor – that is, there is no interaction between the factor and the covariate. This assumption, called homogeneity of regression slopes, was tested prior to conducting the analysis of covariance using a customized general linear model (Green & Salkind, 2005). The results of the test indicated that the assumption was not violated in this study. The results are presented below.
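As an illustration only, the following is a minimal sketch of how such a check could be carried out. It assumes Python with pandas and statsmodels rather than the SPSS procedure described by Green and Salkind (2005), and the data frame and variable names (score, group, covariate) are hypothetical placeholders, not the study's data.

```python
# Sketch of a homogeneity-of-regression-slopes check followed by the ANCOVA
# itself (illustrative only; not the authors' SPSS analysis).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per student teacher: outcome score, group membership, and a
# covariate such as a self-efficacy rating (all values are made up).
df = pd.DataFrame({
    "score":     [3.1, 2.8, 3.4, 2.9, 2.6, 2.7, 3.0, 2.5],
    "group":     ["TECA", "TECA", "TECA", "TECA",
                  "control", "control", "control", "control"],
    "covariate": [6.2, 5.8, 6.9, 5.5, 6.4, 5.9, 6.1, 5.6],
})

# Step 1: fit a model that includes the group-by-covariate interaction.
# A non-significant interaction term indicates the regression slopes are
# homogeneous across groups, so ANCOVA is appropriate.
slopes_model = smf.ols("score ~ C(group) * covariate", data=df).fit()
print(sm.stats.anova_lm(slopes_model, typ=2))

# Step 2: if the assumption holds, drop the interaction and run the ANCOVA.
ancova_model = smf.ols("score ~ C(group) + covariate", data=df).fit()
print(sm.stats.anova_lm(ancova_model, typ=2))
```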

Discussion

The current study was conducted on the third iteration of a technology-enhanced cognitive apprenticeship (TECA) called eSupervision. In the previous study of eSupervision, student teachers scored the same or slightly higher than non-eSupervision students on each of the abilities measured by PACT (i.e. planning, implementing, assessing, and reflecting) after accounting for self-efficacy and quality of support from the cooperating teacher (Kopcha & Alger, 2011). A similar result was achieved in the current study.

Conclusion

There is a growing interest in using technology to enhance the apprenticeship that occurs during the preparation of teachers. While continued research is warranted, the results of this study suggest that the supervision of the clinical experience can be augmented with technology in a way that capitalizes on the strengths of the triad model while addressing some of its longstanding weaknesses. At the very least, this study indicates that teacher educators can use technology to improve the manner in which student teachers are supervised during the clinical experience.


References (49)

  • Blue Ribbon Panel. (2010). Transforming teacher education through clinical practice: A national strategy to prepare effective teachers.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences.
  • Collins, A., et al. (1991). Cognitive apprenticeship: Making thinking visible. American Educator.
  • Darling-Hammond, L. (2006). Constructing 21st-century teacher education. Journal of Teacher Education.
  • Dennen, V. P., et al. (2008). The cognitive apprenticeship model in educational practice.
  • Feiman-Nemser, S. (2001). From preparation to practice: Designing a continuum to strengthen and sustain teaching. Teachers College Record.
  • Fisch, A., et al. (2011). Independence and interdependence: An analysis of pre-service candidates’ use of focused assignments on an electronic discussion forum during the initial field experience. Interdisciplinary Journal of Teaching and Learning.
  • Fives, H., et al. (2009). Examining the factor structure of the Teachers’ Sense of Efficacy Scale. The Journal of Experimental Education.
  • Fraenkel, J., et al. (2006). How to design and evaluate research in education.
  • Gay, L. R., et al. (2009). Educational research: Competencies for analysis and interpretation.
  • Gentry, L. B., et al. (2008). Technologically-based mentoring provided to teachers: A synthesis of the literature. Journal of Technology and Teacher Education.
  • Ghefaili, A. (2003). Cognitive apprenticeship, technology, and the contextualization of learning environments. Journal of Educational Computing, Design, and Online Learning.
  • Gomez, L. M., et al. (2008). Creating social relationships: The role of technology in preservice teacher preparation. Journal of Teacher Education.
  • Green, S., et al. (2005). Using SPSS for Windows and Macintosh: Understanding and analysing data.

    Theodore (TJ) Kopcha is an assistant professor in the program of Learning, Design, and Technology at the University of Georgia. He received his Ph.D. from Arizona State University and has written several papers on teacher professional development in K12 and higher education.

    Christianna Alger is a Professor Emeritus from San Diego State University in the School of Teacher Education.
