As a high school administrator, I am often the person who has to respond to a teacher's request to address cheating. Since the introduction of artificial intelligence detection tools, I have watched as efforts to catch cheating often do more harm than good. Not only are these tools unreliable, but they can also significantly damage trust between student and teacher.
Although some students definitely cheat, I have also learned that there are many reasons why they do. Sadly, I have seen many educators react to all cheating as a personal affront rather than try to understand the underlying causes. Adding AI detection tools to the mix can create an additional layer of stress and disappointment for all involved, even as recent research casts doubt on how reliable these tools actually are.
Because we have come to rely on computers so much, educators might be mistakenly led to believe that AI detection tools could eliminate cheating altogether. This form of techno-chauvinism, the belief that AI can eradicate an age-old concern, is causing additional harm and angst among students and teachers who are grappling with this new reality.
Since the explosion of ChatGPT and other AI tools beginning in late 2022, many educators, including the teachers at my school, have turned to AI detection tools to catch plagiarism. In many of the cases I've seen, students whose writing was flagged by AI detection vehemently denied cheating.
Unfortunately, the harm of accusing students of plagiarism based on inconclusive results has been extreme in some cases. For example, when one writing assignment was flagged for a 51 percent likelihood of AI use, the teacher asked the administration to send that student to another location to be individually supervised for every future test. Rather than treating the result as a probability, which is what I would recommend, the teacher treated that 51 percent as conclusive evidence because it was machine-generated.
I have also met with irate parents who demanded that their child be removed from a class because the teacher asked whether, or implied that, the student might have used an AI tool to cheat. These parents and students considered the presence of an AI detection tool a grave threat to a positive teacher-student relationship.
I am less interested in whether AI detection tools can catch cheating; what does interest me are the opportunities that AI can afford our students to improve their learning.
I believe that AI can be used as a tool to create individualized and differentiated learning opportunities for students. Just as students are now able to use calculators when taking standardized tests, I imagine a world where AI could be available to all students and readily used to assess their creativity and problem-solving skills. Teaching students to use AI to iterate might be the best use of these powerful new tools.
To achieve that goal, and ultimately render cheating concerns irrelevant, schools should consider the following:
1. Establish a clear AI policy that specifies what is allowed on each assessment.
One of the challenges I faced while working to support my teachers as they addressed cheating concerns was that our policy language didn't necessarily specify what was allowed. During one of our investigations into cheating, we discovered that some students were genuinely confused, because they thought that using tools like Grammarly, an AI writing-assistance platform, was allowed.
Even as the pace at which these tools advance requires us to adjust our policies frequently, it is necessary to have clear guidelines about what is allowed. Students deserve the chance to learn how to use these tools productively without being labeled a “cheater.”
2. Involve the students in creating an AI policy.
Ultimately, school policies exist to support student learning, and involving students in the policy-creation process can build their AI literacy skills. It also gives our students the opportunity to understand why certain policies exist and how adapting to them can work to their benefit rather than feel purely punitive.
3. Establish a policy-review process and revisit the policies governing AI use.
To address the impact of rapidly changing technology, schools should regularly review the AI classroom policy. Once again, I would advocate soliciting student input in the process whenever possible.
4. Consider formative and summative assessments that are plagiarism-proof.
Teachers can use AI to expand assessment options and opportunities in the classroom. I encourage teachers to use AI tools in classroom instruction that focuses on what students have learned rather than on having them regurgitate discrete pieces of information they memorized.
5. Give students a dignified way to recover from their mistakes, including cheating.
We all recognize that making mistakes is a part of being human and that learning is a deeply human endeavor. Our cheating policies should reflect this by creating a dignified path to recovery for students who have cheated. That path is particularly important, because learning requires continuous yet productive failures and recoveries.
The most important aspect of my job as a school leader is to keep our students and staff safe, and I consider their emotional safety to be just as important as their physical safety. Just like the power tools in shop class, AI can do incredible harm to our students if not used appropriately. Let's all continue to look at ways to keep our students safe as we implement AI in schools.