Elsevier

Assessing Writing

Volume 50, October 2021, 100563

Repurposing plagiarism detection services for responsible pedagogical application and (In)Formative assessment of source attribution practices

https://doi.org/10.1016/j.asw.2021.100563

Introduction

Plagiarism detection services and software (PDSs) were developed in the mid-1990s to catch Internet-based types of copy and paste plagiarism (Vie, 2013b). Now ubiquitous in higher education, these tools often automatically integrate into popular LMSs like Canvas, Moodle, and Blackboard. Initially, PDSs were designed to “detect” and “protect” institutions from the novel, unexpected ways students were allegedly cheating (Homepage, 2016). The advent of the Internet ushered into academia unprecedented access to global information sharing, abundant (and multimodal) public writing, and the constant emergence of new genres, writing styles, and citation practices. Decades later, educators are still grappling with the pedagogical reverberations and consequences of online writing collaboration, a subject that continues to compound the complicated layers of teaching and assessing source attribution in digital contexts.

Though PDSs appeared an attractive technological solution to these moving, entangled assessment problems, they also introduced numerous pedagogical consequences for students and teachers. As students’ writing processes have evolved and changed in digital spaces, so have researchers’ and teachers’ understandings of the writing process and its collaborative, unruly, and rhetorically situated nature (Vojak, Kline, Cope, McCarthey, & Kalantzis, 2011). PDSs are not equipped to accurately assess plagiarism or capaciously address it in all the forms and modes students compose in.1 However, these programs offered teachers and administrators ways to cope with labor/budgetary constraints, quickly proliferating communication technologies, and new and unfamiliar writing situations (Marsh, 2004; Vie, 2013)—ones they often did not understand how to assess for plagiarism.

As students increasingly write and collaborate with machines in educational and professional contexts, teachers must help students understand how technologies are framing writing, what negotiations they are or are not willing to make with PDSs, and how these tools are impacting students’ writing processes and confidence as writers (Gallagher, 2017, Hart-Davidson, 2018, Kennedy, 2016, Kennedy, 2017). As the COVID-19 pandemic continues and anti-cheating technology burgeons (Harwell, 2020), we must consider how the technology we invite into our classrooms compels teachers and students to write or act in particular ways (Beck, 2018, Reyman, 2017), and what those actions and consequences are implying we value when we assess—and teach—student writing.


Understanding the operation: how do PDSs function?

Popular PDS programs, such as SafeAssign and Turnitin, deploy algorithms that search internet databases, programmatic repositories of student writing, and paper mill papers to find strings of text that match students’ work (Purdy, 2005). PDSs are implemented at an instructor’s or institution’s discretion in most cases, but students do not have an option to opt out of working with a PDS unless they ask their instructor. In many cases, learners engage with an originality report on high-stakes, final assignments
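The text-matching step described above can be sketched as a simple word n-gram comparison. This is an illustrative toy under stated assumptions—the function names and the five-word matching window are my own, not the proprietary fingerprinting that Turnitin or SafeAssign actually perform at scale:

```python
# Toy sketch of PDS-style text matching: flag a submission's word n-grams
# that also appear in a corpus of source texts, and report the overlap as
# a percentage (a crude analogue of a composite "originality score").
# Illustrative only; commercial PDSs use proprietary fingerprinting.

def ngrams(text, n=5):
    """Return the set of word n-grams (as tuples) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity_score(submission, sources, n=5):
    """Percentage of the submission's n-grams found in any source text."""
    sub_grams = ngrams(submission, n)
    if not sub_grams:
        return 0.0
    source_grams = set().union(*(ngrams(s, n) for s in sources))
    return 100.0 * len(sub_grams & source_grams) / len(sub_grams)

source = ("writers must acknowledge others' work and build on it "
          "to produce something original")
copied = "my essay claims writers must acknowledge others' work and build on it"
print(similarity_score(copied, [source]))  # 62.5: shared five-word runs flagged
```

Note what even this toy makes visible: the score counts surface overlap only, with no notion of quotation, citation, or common phrasing—which is why a “high” composite score is not, by itself, evidence of plagiarism.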

The limitations of PDSs

Plagiarism is challenging to address with technology because it is a concept at odds with what we know about the social construction of language (Vojak et al., 2011). Writers must be inventive, but they must also acknowledge others’ work and build on it to produce something “original” while processing and learning new information. PDSs are designed to privilege outdated authorship constructs that suggest writing is a rigid and solitary practice performed by a tortured, lone genius in a vacuum (

Possibilities for student-driven assessment & collaborations with PDSs

What follows are suggestions for repurposing a PDS to help students learn how to rhetorically understand citation practices, research processes, and how to interpret data about their own writing. Inviting students to interrogate how these programs flatten complexity and oversimplify plagiarism can help build their awareness about what they give up and gain as writers when they work with PDSs. Rather than focusing on one specific program, I overview shared technological features across different

Informate data about student writing with students

  • a.

    If an instructor informates data about student writing from an originality report and shares that process with students, it will help them model the same approach when they look at the report without their teacher’s guidance. In practice, this can take shape in many different ways. For example, if a student receives a high percentage composite score on a report, a teacher can anonymously share the report with their class and discuss why the score was high, why that is arbitrary, and what a

Encourage students to revise and draft with the PDS to minimize anxiety and high stakes

  • a.

In working with the informated data about their writing, students will practice revising and negotiating their authorial choices when collaborating with assessment technology. If a PDS is framing their work as plagiarized, students will learn how to protect their authorial integrity and push back, revise, and adapt their work to meet academic standards of writing needed to graduate. However, this approach will only work if an instructor encourages multiple revision processes and does not attach

Pedagogy of resistance

  • a.

As Vie suggested in her 2013a work on resisting PDSs, instructors must encourage students to critically engage with how these technologies are impacting them, their writing, and their confidence. Students are writing with machines with increasing frequency; they have to know how to navigate these dynamics and negotiations. Here are some questions to consider asking students to help them interrogate the dynamics at play when they work with PDSs:

    • i.

      What does the PDS report add to your writing?

Futures

As we begin reshaping what education will look like in decades to come after we address the consequences of a post-COVID world, we have to consider a more effective balance between humans and machines that plays to each agent’s strengths in writing assessment. (In)Formative assessment offers us an act of “pedagogical resistance” (Vie, 2013) when teachers must work with an educational technology that typically disenfranchises students. These concerns are part of a larger conversation about the

Jordan Canzonetta is an assistant professor of English Studies at Lewis University in Romeoville, Illinois. Her research focuses on algorithmic rhetorics, automation, plagiarism, surveillance, disability rhetorics, online writing pedagogy, and educational technology design and has been published in the Journal of Writing Assessment.


References (30)

  • Council of Writing Program Administrators (CWPA) (2003, January). Defining and avoiding plagiarism: The WPA statement on...
  • D. Edwards et al.

    Only geniuses can be writers

  • Eli Review. Retrieved August 2, 2021, from:...
  • Grabill, J. Do we learn best together or alone? Your life with robots. Computers & Writing Conference, May 20, 2016....
  • W. Hart-Davidson

    Writing with robots and other curiosities of the age of machine rhetorics
