Early in 2018, ed-tech company Clever launched a new product aimed at helping tens of thousands of schools tackle a vexing challenge: determining whether students are actually using the software and learning apps on which the K-12 sector spends billions of dollars each year.
Less than 18 months later, though, it pulled the plug. Clever announced it was "sunsetting" the effort and moving in a new direction.
What happened? Why did the effort flop? And what does Clever's surprise pivot say about how serious schools are about keeping track of all the technology that has flooded public school classrooms?
"What we found once we were in market was that it proved to be a nice-to-have more than a must-have for a lot of district leaders," Dan Carroll, the company's co-founder and chief product officer, said in a July interview. "They were excited to try it out, but [tracking ed-tech usage] wasn't one of their top three or five priorities for the year."
A Multi-Pronged Problem
That analysis will likely come as a disappointment to those K-12 CTOs and CIOs who describe the opacity around ed-tech usage as a significant, multi-pronged problem.
If schools don't know what digital and online learning tools teachers and students actually use, their thinking goes, how can they decide what to purchase and renew? How are they supposed to better focus scarce professional-development resources? How can they determine what ed-tech strategies will actually impact student achievement?
This special report, the first of three that Education Week is producing for K-12 ed-tech leaders during the 2019-20 school year, examines how schools track tech usage and what steps they should take to make better use of educational technology tools. Read the full report here.
A number of products on the market aim to help. Companies such as LearnPlatform and BrightBytes, for example, offer analytics and dashboards that present information such as how many students log in to various software programs, how much time they spend using them, and what kind of progress they are making.
BrightBytes has tried to highlight the problem of scattershot usage. In 2018, for example, the company released an analysis based on how nearly 400,000 students in 48 districts used 177 different learning apps. It found that the typical district didn't use 30 percent of the ed-tech licenses it purchased.
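The arithmetic behind that finding is simple. Here is a minimal sketch in Python of a license-utilization calculation, with made-up seat counts; it is an illustration of the math, not BrightBytes' actual methodology:

```python
def unused_license_share(purchased: int, activated: int) -> float:
    """Share of purchased licenses that never saw any use."""
    if purchased <= 0:
        raise ValueError("no licenses purchased")
    return (purchased - activated) / purchased

# Illustrative figures only: a district that bought 1,000 seats but saw
# activity on just 700 of them has 30 percent of its spend sitting idle,
# the shortfall BrightBytes found in the typical district.
print(f"{unused_license_share(1000, 700):.0%} of licenses unused")  # 30%
```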
Given such realities, it was significant news when San Francisco-based Clever announced it was launching a new product called Goals. The company had already won the trust of roughly 60,000 schools by offering easy-to-use solutions to mundane problems, such as loading class rosters into learning software and allowing students to access dozens of online tools with a single username and password.
The company's new idea was similar: give educators and administrators a single place to look across multiple learning tools at two simple metrics, how much time students spent using each program and how much progress they made.
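In data terms, the pitch amounted to rolling many vendors' event logs up into those two numbers per program. A minimal sketch of what such an aggregation could look like, using a hypothetical record layout rather than Clever's actual schema:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class UsageEvent:
    app: str           # which learning program produced the event
    minutes: float     # session length
    lessons_done: int  # progress recorded during the session

def summarize(events: list[UsageEvent]) -> dict[str, tuple[float, int]]:
    """Roll raw events up to (total minutes, total lessons) per app."""
    totals: dict[str, list] = defaultdict(lambda: [0.0, 0])
    for e in events:
        totals[e.app][0] += e.minutes
        totals[e.app][1] += e.lessons_done
    return {app: (minutes, lessons) for app, (minutes, lessons) in totals.items()}
```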
Clever officials were so convinced that districts would find such information valuable that they made Goals the company's first service schools would have to pay to use.
But a year after it was launched, the product had just 50 or so paying customers.
Worse, maintaining the service at even that limited scale proved to be a major time-suck for Clever's engineers.
"We thought it would be very hard and expensive, which certainly proved true," Carroll said. "We just couldn't justify it financially."
Unanticipated Complications
One of the school systems that tried Clever Goals, but ended up dropping it, was School District U-46 in Illinois.
It's an ongoing challenge to make usage data relevant to both the specific tool being used and the specific way it's being used in a given school or classroom, said Matt Raimondi, the district's assessment and accountability coordinator, in an email.
For example, for one service, it might make sense to report how many licenses are actually used. For another, it might be more valuable to report how much time students spent on-task in the product. For a third, the most useful information might be the number of lessons or modules each student completes.
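In other words, there is no single usage metric to compute; a reporting layer ends up dispatching a different measure per tool. A rough sketch, with hypothetical tool names and a made-up session record:

```python
from typing import Callable

# Hypothetical session record: (student_id, minutes, modules_completed).
Session = tuple[str, float, int]

def licenses_used(sessions: list[Session]) -> float:
    return len({student for student, _, _ in sessions})  # distinct active seats

def minutes_on_task(sessions: list[Session]) -> float:
    return sum(minutes for _, minutes, _ in sessions)    # total engaged time

def modules_completed(sessions: list[Session]) -> float:
    return sum(done for _, _, done in sessions)          # curriculum progress

# The meaningful yardstick differs product by product; this mapping is illustrative.
METRIC_FOR_TOOL: dict[str, Callable[[list[Session]], float]] = {
    "reading_seat_licenses": licenses_used,
    "math_practice": minutes_on_task,
    "science_course": modules_completed,
}
```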
Further complicating matters, one school or district might use a tool for core instruction for all students, while another might use the same tool as remediation for a small subset of students.
"Usage data can end up being pretty noisy or misleading in that regard," Raimondi said. "A usage report might show 20 percent of students using software X, and that could actually be 100 percent of the students that are supposed to be using it."
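Raimondi's example is, at bottom, a denominator problem, as a quick sketch shows (all figures are made up):

```python
def usage_rate(active: int, denominator: int) -> float:
    return active / denominator

active = 240          # students who actually logged in to software X
all_enrolled = 1200   # every student in the district
assigned = 240        # students the tool was actually rolled out to

print(f"{usage_rate(active, all_enrolled):.0%}")  # 20%, looks like a failure
print(f"{usage_rate(active, assigned):.0%}")      # 100%, exactly on target
```

The same raw count tells opposite stories depending on which population the report divides by.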
Clever's Carroll said that was just one of several issues the company faced with Goals.
Even a single usage metric, such as time-on-task, proved nearly impossible to standardize, he said. How, for example, do you capture it for learning tools that require offline work?
And even when different companies were focused on the same data, he said, they often presented the information in different formats. That meant linking each software program into the Goals dashboard required a custom integration, which proved costly and time-consuming.
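That is the classic adapter problem: each vendor reports the same underlying facts in its own shape, and someone has to write a bespoke parser for each one. A minimal sketch under two assumed payload formats (neither corresponds to a real vendor):

```python
from datetime import timedelta

def from_vendor_a(payload: dict) -> timedelta:
    # Hypothetical format: seconds of engagement under "engagement_secs".
    return timedelta(seconds=payload["engagement_secs"])

def from_vendor_b(payload: dict) -> timedelta:
    # Hypothetical format: an "HH:MM" string under "time_on_task".
    hours, minutes = map(int, payload["time_on_task"].split(":"))
    return timedelta(hours=hours, minutes=minutes)

# Every additional program means another bespoke parser like these,
# the per-integration cost that made Goals expensive to maintain.
ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}
```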
But the biggest hurdle by far, Carroll said, wasn't technical at all. It was finding ways to help educators and administrators move beyond just looking at usage data to actually doing something useful with the information.
"That kind of change management is really a Herculean effort," he said. "You need cultural buy-in, great training, and time allocated for people to look at the data and talk about it. We kind of underestimated the challenge there."
'Priorities of District Leaders'
Paige Kowalski isn't one to say "I told you so."
But back when Clever Goals was launched, the executive vice president of the nonprofit Data Quality Campaign warned that real data-driven change only comes when educators and administrators get structured opportunities to turn information into action.
"There's this belief out there that if we just get schools one more data point, it will really tell them something," Kowalski said in a July interview. "But it doesn't. It just begs more questions. I think that's what Clever ran into."
While the company has changed direction and pulled Goals off the market, it isn't getting out of tracking ed-tech usage altogether.
Clever is now working on a free analytics tool that seeks to report student and teacher visits to online resources, including how much time they spend there. The new product is in testing at 300 schools, and the company is eyeing early 2020 to make it more widely available.
Once the tool is up and running, Clever users won't be charged extra to use it, Carroll said. The company still isn't prepared to offer teachers the kind of time- and labor-intensive coaching needed to make usage data valuable for them, he said. So the new product will focus on helping school and district leaders track aggregate usage and make system-level decisions.