

Ed-Tech Companies Should Open Algorithms to Scrutiny, Report Suggests

By Sarah Schwartz | August 17, 2017

As personalized learning and adaptive learning tools gain ground in K-12 schools, researchers and educators are raising questions: What's behind the algorithms that determine students' growth and progress, and how can students' personal and academic data be stored safely?

A report from the University of Colorado's National Education Policy Center (NEPC) probes these questions, calling for stronger regulations around the use of algorithms in personalized learning and the collection and storage of student data.

The researchers suggest that the swift and enthusiastic adoption of personalized and adaptive learning in some schools has come at the expense of student privacy. The algorithms that power ed-tech software need to be publicly available for educators and researchers to review, and companies and districts need to be held accountable if they violate students' privacy, the report argues.

"Science and education both are supposed to be open processes, and open to discussion and evaluation," said Faith Boninger, a research associate at NEPC and the lead researcher on the report.

When personalized learning software uses proprietary algorithms, the report contends, developers are obscuring what data is used to evaluate students and how that evaluation process works. Educators have to trust that the criteria and methods set up by the company are pedagogically and ethically sound.

This "black box" could pose a problem, the researchers wrote.

Algorithmic bias, and the limitations of adaptive technology in general, have drawn growing scrutiny.

Though it's common to think of algorithms as neutral, factual tools, these formulas are designed by people and shaped by human judgment. The algorithms in educational software, the report reads, "reflect the assumptions and biases of their developers and are subject to limitations in what the software can actually sort and measure."
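
For illustration only, here is a minimal, hypothetical sketch of how such assumptions can end up baked into an adaptive tool's scoring logic. Every name, threshold, and penalty below is invented; none of it comes from the NEPC report or any actual product.

    # Hypothetical sketch: a toy "adaptive" mastery check.
    # Every constant here is a developer's design decision, which is
    # exactly the kind of embedded assumption the report argues should
    # be open to outside review.

    MASTERY_THRESHOLD = 0.80  # why 80 percent? a developer chose it
    RECENT_WINDOW = 5         # only the last 5 answers count
    SPEED_PENALTY = 0.05      # fast answers are assumed to be guesses

    def mastery_score(answers):
        """answers: a list of (is_correct, seconds_taken) tuples."""
        recent = answers[-RECENT_WINDOW:]  # earlier work is ignored
        if not recent:
            return 0.0
        score = sum(1 for correct, _ in recent if correct) / len(recent)
        # Penalizing quick responses assumes they are guesses, an
        # assumption that can misjudge students who simply know the
        # material well.
        fast_answers = sum(1 for _, secs in recent if secs < 3.0)
        return score - SPEED_PENALTY * fast_answers

    def has_mastered(answers):
        return mastery_score(answers) >= MASTERY_THRESHOLD

    # A student who gets 4 of the last 5 questions right, answering
    # each in 2 seconds, scores 0.80 minus a 0.25 speed penalty, or
    # 0.55, and is marked "not mastered" purely because of the speed
    # assumption.
    print(has_mastered([(True, 2.0)] * 4 + [(False, 2.0)]))  # False

If logic like this stays proprietary, educators cannot tell whether a "not mastered" judgment rests on sound pedagogy or on an arbitrary constant, which is the report's core concern.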

To combat this, NEPC suggests that legislation should require third-party evaluation of all software powered by adaptive technology. Any technology should be assessed for "validity and utility" before it's introduced in the classroom, the report argues.

Policies like these would slow down the tech adoption process so that discriminatory algorithms or holes in companies' data privacy policies can be identified. Boninger said NEPC wants to avoid students playing the role of "guinea pigs."

"A better way to try to weed out unintended consequences," she said, "is to carefully examine what you're using before you start to use it."

The report also calls for school-level policies that outline what data will be used for, how it will be protected, and when and how it will be disposed of.

"There isn't a hurry, really, to get these applications into schools," said Boninger. "What is really important is to protect the kids."


A version of this news article first appeared in the Digital Education blog.