At Pittsburgh’s Avonworth School District, educators are experimenting with a new way to test digital tools they might buy for their classrooms.
In the past, the approach to such an ed-tech pilot project might have involved an administrator or teacher hearing buzz about an app or piece of software, trying it out in a class for a while, and then recommending it based on whether students or teachers said they liked it. But in Avonworth this year, that process is more formal, with upfront planning, a relationship with the product vendor, and conclusions based on hard data.
The old way “was more of an impulse buy,” said Scott Miller, the principal of the Avonworth Primary Center, a K-2 school. “That’s not really effective. We want to make an educated, informed decision to see if a product is a fit for us.”
School districts routinely do some kind of testing to sample ed-tech products for their students and often end up investing in much of that technology. In 2014, pre-K-12 schools spent billions of dollars on ed tech, according to the Software & Information Industry Association.
But the evaluations often look very different in different districts, or even within the same district. They can be short-lived or long-term, spread over several academic years. And the trials are often amorphous exercises, with no defined way to determine which products are best.
“I see a lot of misunderstandings during this process,” said Katrina Stevens, the deputy director of the office of educational technology at the U.S. Department of Education. “It’s ripe for improvement.”
As districts are inundated with ed-tech products that aim to solve their “pain points” and claim to provide everything from personalized instruction to gamified content, finding ways to help districts run more effective pilot projects, and ultimately make better spending decisions, has become a high priority.
More structured pilot projects are now being encouraged through a number of initiatives. For example, the Learning Assembly project, funded by the Bill & Melinda Gates Foundation, brings together seven organizations working to improve school and district ed-tech projects. (The Gates Foundation also provides support for Education Week’s coverage of college- and career-ready standards and personalized learning.)
One of those organizations, the Washington-based nonprofit Digital Promise, which promotes the use of ed tech in schools, is working with the 1,650-student Avonworth district, and Miller said that’s made a big difference.
Under the project, Avonworth was paired with researchers from Pittsburgh’s Carnegie Mellon University who helped the district set up two elementary-grade pilots this academic year. The personalized-learning program eSpark is being used in 1st grade classes, while the digital toy Puzzlets is being sampled in grades K-2. Students used the tools from October through April.
District officials worked with researchers upfront to determine if the products were aligned with district needs, Miller said. And the district collaborated closely with both vendors to be sure teachers were using the products as intended. Teachers also provided feedback about how eSpark and Puzzlets worked or didn’t, Miller said.
For eSpark, the district will primarily use student-growth and -achievement data to determine effectiveness. For Puzzlets, the district is strictly looking at engagement and student interest, Miller said. Out of this process, the district hopes to craft a system or checklist for pilot projects that can be replicated when its grant through the Gates Foundation runs out, Miller said.
“This is going to allow us to make an educated and informed decision on whether these products are a fit for us,” he said. “If they are, great, but do they need any tweaks? If not, we’ll walk away, no harm, no foul.”
Financial concerns play a major role in the growing interest in creating more formal ed-tech pilot projects in schools.
Ed-tech products “can be an enormous investment for a district,” said Julia Freeland Fisher, the director of education research for the Clayton Christensen Institute, which studies blended learning. “They want to make sure they’re spending their scarce dollars wisely.”
The Education Department’s Stevens said her office is trying to improve rapid-cycle evaluations of ed-tech products by creating an online pilot “wizard,” akin to a TurboTax for school product-testing projects.
The digital toolkit would guide districts on the front end through conducting a needs assessment, handling the technical details of rolling out a pilot, knowing what questions to ask the product developer, and collecting and analyzing data to determine whether a product should be used on a wider basis.
Currently, the department is creating a prototype of the pilot tool and will test it out in districts in the fall, she said.
“We want to walk a school or district leader through setting up a pilot and evaluating the tools being used in their system,” Stevens said.
Other efforts to streamline the pilot-testing process are more regional. LEAP Innovations, a Chicago-based nonprofit that is also part of the Learning Assembly project, is working with schools to bridge the “gap between innovation and education,” said CEO Phyllis Lockett.
LEAP partners with Chicago-area K-8 schools to match them with ed-tech companies seeking to pilot their products. “Many of our schools get calls from vendors constantly, and they don’t know where to start,” she said.
LEAP convenes a panel of experts, including learning scientists, educators, and ed-tech investors, to evaluate and vet products. Those that are approved are matched with individual Chicago-area schools for one-year pilots. Participating educators receive a semester of professional development before their project launches to hone their role in the process, Lockett said.
LEAP then works with researchers to crunch the data. “We can tell schools if the solution moved the dial on achievement,” Lockett said.
Digital Promise is working on pilots with several other districts in addition to Avonworth. It also plans to use the information it has gleaned to create a product-testing template, which can then be tailored to each district’s unique characteristics, said Aubrey Francisco, the organization’s research director. Digital Promise is also hoping to share the results of district pilot projects to provide information to other educators.
“A district might look at a study and say, ‘I feel comfortable using this product,’ based on the research done elsewhere,” she said.
Along those lines, Jefferson Education, a commercial entity advised by the University of Virginia’s Curry School of Education, hopes to build a system to share robust pilot-project information with valid data on a wider scale, so that every district doesn’t have to do its own test of a product, said CEO Bart Epstein. The project is in the beginning stages, he said.
“Very few schools have the bandwidth to be able to do pilots properly,” he said. “Right now there are probably 1,000 school districts all reviewing the same 15 math products.”
In a 2015 Digital Promise study, researchers found that districts’ prevailing processes for testing technology products are largely informal and often lack a clear approach and consistency. The study also found a disconnect between the aims of companies looking to test their products and those of the schools hosting the trials.
“There’s a real need to have a more structured process to talk about what is needed, how to bring teachers in early so they buy in, how to work with the developer, implement properly, and measure success,” Francisco said.
Some of these new pilot efforts may also help districts that have already purchased ed-tech software and digital tools in an ad hoc way. That鈥檚 the situation in the 1,200-student West Ada district in Meridian, Idaho, where 50 different math programs are being used across schools, said Eian Harm, the district鈥檚 research and data coordinator.
West Ada is, in effect, trying to run pilot projects in reverse on the five most popular math programs to determine which ones are most effective, Harm said.
To that end, a tool like the new EduStar platform might be of help, said Benjamin F. Jones, a professor of entrepreneurship and strategy at Northwestern University鈥檚 Kellogg School of Management, who is a co-creator.
EduStar, developed in collaboration with the nonprofit digital-learning provider PowerMyLearning, aims to provide rigorous and rapid trials of digital-learning tools and more granular content, like a lesson, a video, or a game. Those trials can take just a few minutes (to test an app, for example) and are done through an automated system, he said. Currently, the system is being tested with 40 schools already using the PowerMyLearning platform, but Jones said he hopes to add many more that want to test out digital content.
The goal is to provide feedback to the developer about how a product works in a real classroom and to communicate deeper research about why or how certain games or techniques work or don鈥檛, Jones said.
“In the long run,” he said, “we hope the system can scale so it could test large numbers of digital-learning activities and provide a Consumer Reports function in the marketplace.”