When Jeremy Sell saw the word “poignant” spelled correctly in an essay, the jig was up.
Sell, a high school English teacher in California, already suspected that his student had used a generative AI tool like ChatGPT to compose the assignment.
The writing was a little too perfect. The voice was flat. “Too vanilla, bland, smooth,” in Sell’s description. The high-level vocabulary word—not one Sell expected from this particular student—was just further confirmation.
Since ChatGPT’s introduction a year ago, Sell and other educators have used their grasp of students’ writing abilities—a Spidey sense honed over years in the classroom—to figure out whether an assignment had an AI ghostwriter.
But a newer, free version of ChatGPT released in the spring, ChatGPT 3.5, and even newer tools, like Google’s just-unveiled Gemini, are scuttling that strategy. AI-generated writing is rapidly becoming too sophisticated for even veteran educators’ detective skills to catch, educators and experts say.
Enterprising students can feed their own writing into these tools and direct them to emulate their voice and style, even asking for a few spelling and grammar mistakes to make the cheating less obvious. Already, AI can produce a reasonable facsimile of a 4th grader’s essay.
And the technology is bound to get even better at imitating an individual author’s voice, experts say. That may leave even educators who consider themselves solid AI-sniffing bloodhounds in the lurch. After all, even linguists have difficulty distinguishing between a human-crafted piece of writing and one produced by AI.
“I definitely used to rely on that like [a] sixth sense of ‘this doesn’t sound like you. I’ve read things you’ve written,’” said Carly Ghantous, a humanities instructor at Davidson Academy Online, a private virtual school. “Now that it’s clear that AI could generate that same [voice], it is a little bit unnerving to me as an English teacher who does require a lot of writing.”
To be sure, when ChatGPT emerged about a year ago, it was “likely true” that teachers could use their knowledge of student writing, whether of students at a particular grade level or of their own individual students, to root out cheaters, said Stacy Hawthorne, the chief academic officer for Learn21, a nonprofit organization that works with schools to improve their use of education technology.
“It’s definitely not true now,” Hawthorne said.
‘Writing generic responses to generic prompts’
As an experiment, Hawthorne prompted ChatGPT to write in the style of a 4th-grade student, responding to a question about whether old objects, like paintings, should be thrown away or preserved. She asked the tool to consider tone, voice, sentence structure, and transitions. She also provided three writing samples from real 4th-grade students, taken from the Utah Department of Education’s website.
ChatGPT’s first result had the right tone and style, but the spelling and grammar were too carefully crafted for a typical 4th-grade student. Hawthorne gave the AI tool that feedback, and seconds later, ChatGPT spit out a version with more errors.
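In rough terms, an exchange like the one Hawthorne describes could also be scripted against a chat model’s API. The sketch below is purely illustrative and assumes the OpenAI Python library; the model name, the sample sentences, and the feedback message are placeholders, not material from her actual experiment.

```python
# Illustrative sketch of a style-emulation prompt with an iterative feedback
# step, assuming the OpenAI Python library. Samples and model are placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

samples = (
    "Sample 1: I think old payntings should be keeped because they are speshal.\n"
    "Sample 2: My class went to the museum and we saw realy old stuff.\n"
    "Sample 3: Old things tell us about the past so we shud not throw them away."
)

messages = [
    {"role": "system",
     "content": "Write like a 4th-grade student. Match the tone, voice, "
                "sentence structure, and transitions of the samples you are given."},
    {"role": "user",
     "content": f"Here are three real 4th-grade writing samples:\n{samples}\n\n"
                "Write a short essay: should old objects like paintings "
                "be thrown away or preserved?"},
]

# First draft in the requested style.
draft = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(draft.choices[0].message.content)

# Second pass: feed the model's own draft back with a critique and ask for
# a revision, mirroring the feedback loop described above.
messages.append({"role": "assistant", "content": draft.choices[0].message.content})
messages.append({"role": "user",
                 "content": "That spelling and grammar is too careful for a typical "
                            "4th grader. Rewrite it with a few realistic mistakes."})
revised = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(revised.choices[0].message.content)
```

The second call is the same iterative step Hawthorne used manually: tell the model its first draft was too polished and ask for a more plausible revision.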
While it’s hard to imagine a 4th-grader navigating the tool successfully enough to cheat this way, it’s not much of a leap to envision a high schooler feeding ChatGPT, Gemini, or another generative AI tool some of their most recent work and asking for an essay on, say, the causes of the Civil War, in their voice, Hawthorne said.
It’s unclear how many students have caught on to AI’s power to emulate their writing yet.
Sell doesn’t expect many of his students to have “the wherewithal to do that”—at least not yet. In fact, he wondered if a student’s ability to get an essay out of a generative AI model that plausibly imitates their writing voice might be a demonstration of a different kind of know-how.
“If I’ve got a kid who’s willing to put in the time and effort to train a generative AI model on his own writing,” Sell said. “There are some skills there.”
But he expects that kind of imitation to get exponentially easier as the tools mature, making the cheating much harder to catch. That argues for rethinking how writing is taught, Sell said.
“We’re going to have to come to some sort of consensus ourselves about what would be acceptable use. AI is not going anywhere,” Sell said. “Are we going to keep asking them to write really kind of generic responses to generic prompts that we’ve used for years and years and years?”
‘Writing can be a very messy process’
Even as AI gets better at emulating individual students’ writing styles, there are still workarounds to ensure that students are doing their own work, educators and experts say.
For instance, Ghantous already requires students to show their work at different stages of the writing process, including brainstorming documents, outlines, and first drafts. That way, she will at least be able to catch major changes to their initial direction, Ghantous said.
Of course, that is unless students were “using AI through that entire process, in which case that sounds like a lot of work” on the student’s part, Ghantous said.
Teachers can also use software that looks at students’ keystrokes to see whether they actually typed a piece of writing themselves or cut and pasted it into a document, said Peter Gault, the founder and CEO of Quill, a nonprofit that builds literacy-focused education technology. He recommends DraftBack, which must be installed before a student begins writing an assignment.
The tool can show how much a student is revising, how they are crafting individual arguments, and what their process for writing an outline looks like.
To be sure, it’s fair to question whether teachers should be getting such a thorough picture of their students’ writing process, Gault added.
“Writing can be a very messy process,” Gault said. “The big ethical consideration is whether a teacher should or should not be able to see the students’ whole drafting process or not. But from a perspective of trying to see authentic writing happening, this is one way of doing so.”
Will the real F. Scott Fitzgerald please stand up?
AI’s ability to mimic a writer’s voice can also serve as a teaching tool, exposing students to both the uses of the technology and the idea of a writer’s voice, Sell said. He’s teaching the high school English class staple The Great Gatsby by F. Scott Fitzgerald this school year and wondered if students could put some passages from the book into an AI tool and see how close it comes to emulating Fitzgerald’s vivid prose.
English teachers could do the same with other authors who have a distinctive style, such as Ernest Hemingway, William Faulkner, or Virginia Woolf, Sell suggested.
“You can use it [to] teach them about voice and what is it that makes this piece supposedly in the style of Hemingway? Where does [AI] fail to capture the [author’s] voice?” Sell said. “And as you’re having those kinds of discussions, then it opens up possibilities for students to think about ‘OK, so what is my writer’s voice?’ Because, you know, they might have written these five-paragraph essays, but is that genuinely their voice?”
Ghantous agrees that could be a good way to use the tool productively. Still, AI’s ability to imitate individual writers—including students—convincingly is just another argument in favor of figuring out how to teach writing in a world where generative AI exists, she said.
“I definitely think it requires us as writing teachers to reexamine what we’re asking students to do and why we’re asking them to do it,” Ghantous said. “Is what we’re asking them to produce the best way to determine that they have the skills that we think that they need to have?”
Students will almost certainly need to know how to use AI as an effective writing partner in their future careers, Ghantous added.
“I don’t think AI is going to go away when they go to their jobs,” she said. “They’re probably going to be expected to utilize it in some way after they leave high school or college. Maybe our focus shouldn’t be on trying to prevent them from using it but teaching them how to use it correctly and ethically.”