2023 was the year of AI for K-12 education.
Artificial intelligence had been gradually gathering momentum for years, but when ChatGPT arrived, AI suddenly became easily accessible to everyone. Anyone can use ChatGPT and other tools like it, which hold humanlike conversations and instantly answer seemingly any prompt.
Since then, K-12 educators across the country have been weighing what role AI should play in instruction, and how large that role should be, especially as experts say today’s students need to learn to use the technology effectively to succeed in future jobs.
ChatGPT has been around for about a year, and some teachers are already using it to craft lesson plans, communicate with parents, and show students how AI technologies work. Still, many educators say they are not prepared to teach with and about AI, and they have plenty of questions about the technology’s benefits and drawbacks for teaching and learning.
Below are answers to five frequently asked questions about AI in K-12 education.
1. What are the dangers of AI?
AI tools are trained on data sets with a fixed cutoff date that are not updated regularly, so they can provide outdated information or fabricate facts when asked about events that occurred after they were trained.
And because the data sets these tools are trained on contain biased information, that bias carries over into the responses the tools generate. Left unchecked, these AI tools could amplify harmful stereotypes and fuel misinformation and disinformation.
To combat the inaccuracies that come with using these AI models, some education organizations are focusing on a version of the technology some are calling “walled-garden AI.” A walled-garden AI is trained only on content vetted by its creator, instead of the unvetted content scattered across the internet.
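For readers who want to picture what that restriction means in practice, here is a minimal, hypothetical sketch in Python of the walled-garden idea: the assistant answers only from a small set of district-vetted passages and declines anything outside them. The sample documents, matching rule, and threshold are illustrative assumptions, not any vendor’s actual system.

```python
import re

# Illustrative sketch only: a "walled-garden" AI answers questions solely from
# content its creator has vetted, rather than from the open internet.
# The documents, scoring rule, and threshold below are hypothetical stand-ins.

VETTED_DOCUMENTS = {
    "photosynthesis": "Photosynthesis is the process plants use to turn "
                      "sunlight, water, and carbon dioxide into food.",
    "water cycle": "The water cycle describes how water evaporates, "
                   "condenses into clouds, and falls back as precipitation.",
}


def tokenize(text: str) -> set[str]:
    """Lowercase a string and split it into a set of words."""
    return set(re.findall(r"[a-z]+", text.lower()))


def walled_garden_answer(question: str) -> str:
    """Return a vetted answer when the question overlaps approved content;
    otherwise decline rather than guess from unvetted sources."""
    question_words = tokenize(question)
    best_topic, best_overlap = None, 0
    for topic, passage in VETTED_DOCUMENTS.items():
        overlap = len(question_words & tokenize(topic + " " + passage))
        if overlap > best_overlap:
            best_topic, best_overlap = topic, overlap
    if best_topic is None or best_overlap < 2:
        return "I can only answer questions covered by district-approved materials."
    return VETTED_DOCUMENTS[best_topic]


print(walled_garden_answer("How does photosynthesis work in plants?"))
print(walled_garden_answer("Who won last night's game?"))
```

Real products pair a far larger vetted library with more sophisticated retrieval, but the core design choice is the same: the model never reaches beyond approved content.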
2. How can educators incorporate AI in the classroom?
Experts agree that educators should experiment with AI technologies themselves before introducing them into the classroom, so they can get a sense of the tools’ strengths and weaknesses. Some teachers are already using AI to create lesson plans, provide feedback to students, and communicate with parents.
Students need to learn how to use AI technologies as assistants or advisers, experts say. One way to do that would be for teachers to allow the use of ChatGPT for class assignments but require students to acknowledge and document how they used it. For example, a student who used ChatGPT to get feedback on a draft essay can explain which of the tool’s suggestions she agreed with and which ones she didn’t. That approach teaches students to use the tool as a partner instead of having it do all the work for them, and it could also strengthen their ability to evaluate and analyze writing.
3. Can AI help diverse learners?
Some teachers have found uses for ChatGPT and similar tools that help diverse learners with their assignments. For example, teachers can use ChatGPT to lower a text’s Lexile level, a measure of how difficult the text is, for English learners. Special education teachers are already using AI to minimize paperwork and generate individualized education programs, or IEPs.
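As a rough illustration of the text-leveling use, the sketch below asks a generative model to rewrite a science passage at a lower reading level. It assumes the OpenAI Python SDK, an API key stored in the environment, and a placeholder model name; the prompt wording and target level are the author’s own examples, not a vetted classroom tool, and a teacher would still review the output before using it.

```python
# Hypothetical sketch: asking a generative model to rewrite a passage at a
# lower reading level for an English learner. Assumes the OpenAI Python SDK
# (pip install openai) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

passage = (
    "The mitochondria are organelles that generate most of the chemical "
    "energy needed to power the cell's biochemical reactions."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model the district approves
    messages=[
        {
            "role": "user",
            "content": (
                "Rewrite the following passage at roughly a 500L Lexile level "
                "(early-elementary reading level), keeping the science accurate:\n\n"
                + passage
            ),
        }
    ],
)

simplified = response.choices[0].message.content
print(simplified)  # a teacher would still review this before handing it out
```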
However, teachers need to remember that generative AI tools can spit out biased or incorrect information. Researchers have also found that some AI cheating-detection tools incorrectly flag writing by non-native English speakers as AI-generated.
In addition, special education services often include sensitive information that would be risky or potentially even illegal to share on a publicly accessible AI platform that absorbs all the data it receives.
4. What is the best way to pursue professional development on this topic?
To get more comfortable with AI, educators need time to use the technology as teachers and as learners, time to try different approaches with students, and opportunities to collaborate with colleagues and talk about what works and what doesn’t, according to experts. Teacher-preparation programs can also lay a foundational understanding of AI and begin cultivating the skills to harness those tools effectively, so that by the time teachers reach the classroom, they are ready to integrate AI into students’ learning experiences.
5. What policies have districts put into place to address AI use in classrooms?
Most districts still haven’t set clear policies or guidelines about the proper use of AI tools in the classroom. Just two states, California and Oregon, have provided official AI guidance to schools, according to an analysis by the Center on Reinventing Public Education at Arizona State University. Part of the reason could be that the technology is changing so quickly that educators fear any policies they set will be outdated not long after they’re released.
When crafting AI-use guidelines, two major issues to address are ensuring teachers understand AI’s strengths and weaknesses and keeping student data safe, educators say. Guidelines should also address student academic integrity, one of teachers’ biggest concerns.
Several education organizations have created checklists and guidelines for districts to use as they craft policies around generative AI. One from the Consortium for School Networking and the Council of the Great City Schools and another from the Teach AI initiative are two examples.
During this process, it’s important to include teachers, students, and parents, because they may raise questions and challenges that district leaders might not think of, experts say.