Education Week

Special Report
Artificial Intelligence Q&A

Why Schools Need to Talk About Racial Bias in AI-Powered Technologies

By Benjamin Herold | April 12, 2022 | 6 min read

Schools are embracing education technologies that use artificial intelligence for everything from teaching math to optimizing bus routes.

Their goals are to save money, personalize student learning, and free teachers from rote administrative tasks.

But how can educators know if the data and design processes those products rely on have been skewed by racial bias? And what happens if they're afraid to ask?

"If you're not breaking data down by subpopulations to really understand where harm might be caused, you're perpetuating racist systems and structures that have been in place for centuries," said Sierra Noakes, the Edtech Marketplace Project Director at the nonprofit Digital Promise.

The group has teamed up with the fellow nonprofit EdTech Equity Project to offer a new product certification focused on racial equity in AI design. The idea is to provide both schools and companies with a common language and process for evaluating whether the makers of education technology tools are taking steps to:

  • Identify and question their own biases and assumptions.
  • Ensure the data used to train their artificial intelligence systems aren't tainted by bias.
  • Provide educators and families alike with more visibility into how their products actually work and the risks they might pose.

The initiative comes on the heels of three other certifications offered by Digital Promise. Twenty-seven ed-tech products have already been recognized under one of those certifications, sixty-one have earned another, and one product has earned a recently launched certification regarding the use of learning analytics.


As part of the pilot for the new racial equity certification, an undisclosed number of ed-tech companies volunteered to send at least one product through the review process. Findings will be made public in May.

The results will come at a moment when school districts around the country face significant backlash for their efforts to teach about racism and promote racial justice. Still, Noakes and Edtech Equity Project co-founders Madison Jacobs and Nidhi Hebbar said in a wide-ranging interview with Education Week that now is the time for a broader conversation about bias in AI.

"Companies are taking advantage of schools' misunderstandings of artificial intelligence," Jacobs said. "It's important for us to combat that."

The conversation with Hebbar, Jacobs, and Noakes has been edited for length and clarity.

What are some examples of AI-powered ed-tech tools that could have blind spots or have been designed using faulty or racist assumptions?

Hebbar: Personalized-learning software that adapts to the answers students give. Language-learning software that takes into account how students speak but is often trained on data from native English speakers born in America and treats other accents and dialects as wrong. Also, administrative tools that are used to identify who might be at risk for behavioral or academic reasons.

Jacobs: A lot of school districts are thinking about how they would utilize facial recognition technology. Those systems are clearly not built with the safety and security of Black folks in mind. It's also all kind of connected. Students might get misidentified or mislabeled in one system, and that system then pushes the data to other systems.

Noakes: What we've learned in creating the certification so far is that there's tremendous resistance from [companies] to collecting and disaggregating data by race. They're sort of living in this place where ignorance is bliss. But unless a company is intentionally looking at its impact data, it's perpetuating a harmful structure.
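The disaggregation Noakes describes can be made concrete with a small sketch: instead of reporting one overall accuracy number, break a model's results down by student subgroup to see where errors concentrate. The records and subgroup labels below are invented purely for illustration; a real audit would use a vendor's actual prediction logs and district-defined subgroups.

```python
# Hypothetical illustration of a disaggregated audit. All data here is invented.
from collections import defaultdict

def accuracy_by_subgroup(records):
    """Return {subgroup: fraction of correct predictions} for each subgroup."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for rec in records:
        total[rec["subgroup"]] += 1
        if rec["predicted"] == rec["actual"]:
            correct[rec["subgroup"]] += 1
    return {group: correct[group] / total[group] for group in total}

# Invented prediction log: each row is one student's predicted vs. actual outcome.
records = [
    {"subgroup": "A", "predicted": 1, "actual": 1},
    {"subgroup": "A", "predicted": 0, "actual": 0},
    {"subgroup": "B", "predicted": 1, "actual": 0},
    {"subgroup": "B", "predicted": 0, "actual": 0},
]

print(accuracy_by_subgroup(records))  # → {'A': 1.0, 'B': 0.5}
```

Overall accuracy here is 75 percent, which looks fine in aggregate; only the per-subgroup breakdown reveals that the tool is wrong half the time for group B. That gap is exactly what stays invisible when a company declines to disaggregate.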

What harms can result for students of color? Are they hypothetical, or are we already seeing them in actual schools?

Noakes: AI tools often make decisions on behalf of teachers without teachers having any insight into what variables are leading to those decisions. And in some instances, there's no override function. So, for example, grouping students. If those decisions are dictated by an algorithm, educators are blind to what the decisionmaking process looks like. These tools are making decisions teachers disagree with, but teachers can't go to school leadership and say this tool is not taking something into account.

Hebbar: It's not like, "AI is bad, full stop." But to be used in schools, it needs to be done in a way that allows educators who have a full understanding of a child to override [algorithmic decisions]. Giving users the ability to course-correct can mitigate a lot of the mistakes in technology, but we're not seeing that with a lot of tools.

How aware of these problems are K-12 leaders?

Hebbar: It's a mix. A lot of districts, especially the large ones with a lot of Black and brown students, are very aware of this. Others don't even know which tools they're purchasing that use AI. But what we generally see is that leaders often feel like, "I'm not an expert; I don't even know what questions to ask." The idea behind the certification is to make that process simpler.

Jacobs: Part of the problem is there is no across-the-board policy that requires companies to disclose what is actually operating inside their technology. And I think a lot of leaders in schools still have the perception that if you're basing things on numbers, then systems must be operating in a neutral fashion. But these systems are built by people, and therefore people's inherent biases, their unconscious biases, their explicit biases go into the systems. A lot of our conversations are about educating people on a general level that these are not neutral systems.

How do companies respond to that message?

Jacobs: There are definitely companies that are sort of hiding behind the protections that the black box of AI offers. They may have engineering teams that are skewed toward white males. Bringing that to light is hard to do. They may not want to expose their shortcomings. There's also this culture in tech about being the best and doing the most, and that really is a horrible barrier to thinking about how you design through the lens of the collective liberation of the people utilizing your technology. But some companies understand that legislation is coming [to address racial bias in tech] and that the venture capital ecosystem is shifting, with more funders wanting to invest in companies that treat equity as a core principle in their product development.

What steps do you want to see companies take to address these problems?

Hebbar: Looking at the collection of data and the types of students whose data are used to train a company's system is a big one. But there are places for bias to come in at almost every stage of the product development and design process. The certification is built off a toolkit to help companies understand how to mitigate that. Even in the ideation phase, you really want to check, "What are our assumptions and blind spots?" And then in the very last step, how are companies re-evaluating on a regular basis whether their products are effective?

There's a backlash throughout the K-12 system to the way schools and districts are approaching racial equity. Many people just don't agree with the idea that schools are systemically racist, and they'll likely disagree with the idea that educators and developers should be trying to undo the existing system. Why should they support this certification?

Jacobs: When you have groups of folks that would fight against a practice of looking into whether students are being treated fairly, that's just another data point and proof that racism and bias exist. And if you don't think that bias exists, then what is the problem with us looking to see?

Noakes: I believe to my core that when we're designing systems and solutions for folks at the margins, who are often ignored in design and development, we're creating tools that help all learners thrive.


Data analysis for this article was provided by the EdWeek Research Center. Learn more about the center's work.

A version of this article appeared in the April 13, 2022 edition of Education Week as Why Schools Need To Talk About Racial Bias In AI-Powered Technologies.
