The U.S. Department of Education isn't exactly known for its facility with metaphors. But a vivid image in a 71-page report epitomizes the agency's central contention that teachers need to have the ultimate power over how the technology is used in schools.
"We envision a technology-enhanced future more like an electric bike and less like robot vacuums," the department wrote in the report, released May 23. "On an electric bike, the human is fully aware and fully in control, but their burden is less, and their effort is multiplied by a complementary technological enhancement. Robot vacuums do their job, freeing the human from involvement or oversight."
In other words: While AI has great potential to help students learn more efficiently and make teachers' lives easier by creating lesson plans, bridging achievement gaps through intelligent tutoring, or making recommendations about how to help individual students grasp a concept, educators should understand its limitations and be empowered to decide when to disregard its conclusions. The report calls this keeping "humans in the loop."
"We are seeing a dramatic evolution in ed tech," said Roberto Rodriguez, the assistant secretary for planning, evaluation, and policy development at the U.S. Department of Education. "Educators have to be proactive in helping to shape policies, systems, and being engaged as AI is introducing itself into society in a more major way."
That means teachers need to be just as aware of AI's potential pitfalls as they are of its promise, the report contends. AI can take on biases in the data used to train the technology. For instance, a voice-recognition program used to measure reading fluency might give an incorrect picture of a student's ability because it hasn't been trained on their regional accent.
The technology is evolving quickly, Rodriguez said. He doesn't want to see school districts fall behind in planning for it.
"I am worried that we are not moving quickly enough [in setting school level policies and district level policies] that both capture the powerful potential that AI provides, but also minimize the risks of these tools in classrooms and in learning for students," Rodriguez said.
The report was informed by four listening sessions conducted last summer and attended by more than 700 experts and educators.
Other recommendations include:
Align AI models to a shared vision for education. Like any tool used to improve student achievement or manage classrooms, AI-powered technology needs to be based on evidence and aligned with what educators are trying to accomplish in the classroom.
Design AI using modern learning principles. AI tools need to build on learners' strengths and help students develop so-called "soft skills" like collaboration and communication, as well as include supports for English learners and students in special education, the report contends.
Inform and involve educators. Teachers need to be at the table when developers create AI-powered technologies aimed at K-12 schools. Educators also must understand that AI can make mistakes, so they need to be encouraged to rely on their own judgment. "Sometimes people avoid talking about the specifics of models to create a mystique," the report says. "Talking as though AI is unbounded in its potential capabilities and a nearly perfect approximation to reality can convey an excitement about the possibilities of the future. The future, however, can be oversold. … We need to know exactly when and where AI models fail to align to visions for teaching and learning."
Prioritize strengthening trust. Educators haven't had a universally positive experience with learning technology. If school districts want to take advantage of the promise of AI tools, they need to build trust in the tech, while making clear it's not infallible. During the listening sessions, the department found that "constituents distrust emerging technologies for multiple reasons," the report said. "They may have experienced privacy violations. The user experience may be more burdensome than anticipated. Promised increases in student learning may not be backed by efficacy research. Unexpected costs may arise."